I HAVE A RESEARCH IDEA, NOW WHAT? A PRACTICAL GUIDE TO RESEARCH METHOD SELECTION IN SPORT MANAGEMENT

Jure Andolšek
School of Economics and Business, University of Ljubljana, Slovenia
jure.andolsek@ef.uni-lj.si

Dynamic Relationships Management Journal, Vol. 14, No. 2, 81-93, doi:10.17708/DRMJ.2025.v14n02a06

Abstract

Research in sport management often defaults to cross-sectional surveys analyzed with regression or structural equation modeling, even when questions and data environments call for temporal, experimental, ethnographic, or integrated designs. This paper addresses that misalignment by advancing method–question fit as the organizing principle for design in sport. We (a) synthesize how quantitative, qualitative, and mixed methods are actually used in sport and where each is strongest; (b) present a five-step, sport-specific decision framework (align question–theory, audit data/access, balance epistemology and feasibility, plan ethics by design, and integrate methods for innovation); and (c) consolidate guardrails for quality (psychometrics, model fit and invariance, qualitative trustworthiness, and mixed-methods integration). Theoretically, we articulate a sport-specific evidentiary logic, an integration blueprint joining variance and process explanations, and a reliability/transportability charter suited to proprietary data contexts. We conclude with practical implications for organizations and training, and a future research agenda emphasizing longitudinal, experimental, ethnographic, and mixed-methods programs.

Keywords: Sport Management, Research Design, Mixed Methods, Decision Framework

1 INTRODUCTION

Research in sport management has matured rapidly over the last three decades, expanding across consumer behavior, sponsorship, governance, organizational behavior, social responsibility, digital platforms, and event management (Doherty, 2013; Smith & Stewart, 2010). Journals such as Journal of Sport Management, Sport Management Review, and European Sport Management Quarterly document this breadth and its growing methodological sophistication. Yet method selection in published studies remains uneven. Many projects default to cross-sectional surveys analyzed with regression or structural equation modeling even when the question, context, or available data call for alternative designs better suited to inference. Conversely, qualitative designs are sometimes adopted without clear links to epistemological stance or analytic rigor, and mixed methods remain underused despite sport management's inherently multilevel, stakeholder-rich settings (Filo, Lock, & Karg, 2015).

Methodological rigor determines the credibility of findings that guide decisions on sponsorship and activation budgets, season-ticket pricing, fan-engagement strategy, athlete and employee well-being initiatives, governance and compliance, and community sport investment (Cornwell, 2013). Sport's distinctive features (e.g., simultaneous cooperation and competition, strong emotional identification among fans, the co-production of experiences by consumers and organizations) create design challenges not always present in other industries (Smith & Stewart, 2010). These conditions complicate sampling, measurement, causal inference, and ethics, making careful alignment between research questions and methods essential.

A further shift intensifies both the opportunity and the responsibility to choose wisely: sport organizations now generate extensive digital traces; transactional ticketing, dynamic pricing histories, app and web analytics, social media engagement, and, in high-performance contexts, wearable and biometric data. These sources can support designs beyond single-wave self-reports, including longitudinal panels, event-history models of churn, quasi-experiments around staggered rollouts, and field experiments embedded in communications. At the same time, access to boardrooms, back-of-house operations, and online fan communities creates opportunities for ethnographic and case-based insights that surveys cannot capture (Hammersley & Atkinson, 2019; Washington & Patterson, 2011). Despite these opportunities, publication patterns still reflect a narrower methodological repertoire than the field's questions (and data) would support.

This paper addresses that misalignment by offering a practical, domain-specific guide to method selection in sport management. The central gap is twofold. Substantively, there is a patterned over-reliance on cross-sectional self-report surveys for questions that are temporal, relational, or processual in nature, for which longitudinal, experimental, ethnographic, or mixed-methods designs would yield stronger evidence. Methodologically, the field lacks consolidated sport-specific guidance that translates general research design principles into the constraints and opportunities of sport organizations. General methods texts provide foundations (e.g., Creswell & Creswell, 2017; Kline, 2023), but researchers still lack a clear mapping from sport management questions to feasible, defensible designs.

Our contributions are practical and theoretical. Practically, we (i) synthesize how quantitative, qualitative, and mixed-methods approaches are actually used in sport and identify where each is strongest, grounding the discussion in influential scholarship and drawing out the design logics that make those contributions credible; (ii) present a five-step decision framework tailored to sport (align question–theory; audit data/access; balance epistemology and feasibility; plan ethics by design; integrate methods for innovation); and (iii) consolidate sport-specific guardrails for quality: psychometric reporting (reliability; convergent/discriminant validity; measurement invariance), model assessment and parsimony in SEM, mitigation of common method variance, qualitative trustworthiness, and integration standards for mixed methods, so researchers can design ex ante for rigor rather than retrofit diagnostics ex post (Podsakoff et al., 2003; Henseler et al., 2015).

Theoretically, we advance three ideas. First, we propose a sport-specific mapping from question types to evidentiary standards, linking prevalent constructs (e.g., identification, perceived value, experience quality, brand associations, psychological safety) to designs required to adjudicate rival explanations.
Second, we clarify how integration across methods enhances explanation: quantitative models identify patterned relationships; qualitative analyses reveal mechanisms and contingencies; mixed-methods integration yields meta-inferences that travel across organizations and cultures (Johnson & Onwuegbuzie, 2004; Venkatesh et al., 2013). Third, we outline a reliability and transportability charter for access-constrained sport research, advocating design transparency, preregistration where feasible, instrument and code sharing within contractual limits, and explicit discussion of what is likely to generalize across clubs, leagues, and contexts (Hair, Black, Babin, & Anderson, 2019; Miles et al., 2014).

2 UNDERSTANDING RESEARCH METHODS IN SPORT MANAGEMENT

2.1 The three families: principles of design, data, and inference

Sport management research relies on three methodological families: quantitative, qualitative, and mixed methods, each with its own logic of evidence and inference. Quantitative designs are used to test hypotheses, estimate relationships, and assess effects with numeric data; they prioritize measurement validity and statistical inference and are typically operationalized through structured instruments, archival datasets, or controlled manipulations (Fischer et al., 2023; Field, 2024). Qualitative designs are used to examine processes, meanings, and contexts through interviews, observation, and documents; they emphasize depth, reflexivity, and trustworthiness, and they are indispensable when researchers seek to understand mechanisms, interpretations, or organizational dynamics not easily captured in standardized measures (Maxwell, 2013; Patton, 2015). Mixed-methods designs purposefully integrate both traditions (sequentially or concurrently) to triangulate findings and produce more complete explanations when questions span both patterned relationships and underlying processes (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2010).

Method selection should follow the logic of the research question. When the goal is to estimate the strength or direction of relationships and test directional hypotheses, for example the effect of perceived sponsor–team fit on purchase intention, quantitative models are the appropriate evidentiary route. Where the aim is to understand how governance reforms unfold, or why fans co-create particular meanings around teams, qualitative designs are better suited. Many sport management questions, however, have both elements: they require estimation of relationships that generalize to broader populations and explanation of processes that vary across contexts. In these cases, mixed-methods designs are warranted.

Across the three families, data collection decisions determine what can credibly be inferred. Quantitative work in sport typically relies on structured surveys/questionnaires; archival and administrative sources such as ticketing, pricing histories, and attendance records; digital traces from web, app, and social media analytics; and, where feasible, experiments in laboratory, online, or field settings.
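Where an organization permits customer-level linkage, survey responses can be joined to such archival records so that self-reported attitudes are paired with behavioral outcomes. A minimal sketch, assuming pseudonymized identifiers and hypothetical column names:

```python
# Minimal sketch: joining survey responses to archival ticketing records so that
# self-reported attitudes can be paired with behavioral outcomes. The identifier
# and column names (fan_id, games_attended, renewed_next_season) are hypothetical.
import pandas as pd

survey = pd.DataFrame({
    "fan_id": [101, 102, 103, 104],
    "identification": [4.2, 3.1, 4.8, 2.5],   # mean score of a multi-item scale
    "satisfaction": [4.5, 3.0, 4.9, 2.2],
})

ticketing = pd.DataFrame({
    "fan_id": [101, 102, 103, 104],
    "games_attended": [15, 4, 17, 2],
    "renewed_next_season": [1, 0, 1, 0],
})

# Join on a pseudonymized identifier; in practice linkage happens inside the
# organization's environment, under the data-sharing agreement and with consent.
linked = survey.merge(ticketing, on="fan_id", how="inner")
print(linked.drop(columns="fan_id").corr())  # do attitudes and behavior align?
```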
Good practice begins with construct clarity, careful item development or adaptation, pilot testing, and transparent reporting of sampling frames and response rates (Hinkin, 1998; DeVellis, 2021). Qualitative work typically draws on semi-structured interviews, observations and ethnography, internal documents and communications, and digital artifacts; sampling is purposeful rather than probabilistic, with explicit strategies for access, diversity of perspectives, and ethical protection, especially where power asymmetries are pronounced (Patton, 2015; Hammersley & Atkinson, 2019). Mixed-methods projects must plan integration from the outset; for example, using an initial survey to identify segments for qualitative follow-up (explanatory sequential), building a survey instrument from qualitative codes (exploratory sequential), or collecting both strands concurrently and integrating them analytically (convergent).

Once data are collected, analysis proceeds along well-established routes. In quantitative studies this can involve anything from descriptive statistics (Nick, 2007) through linear or logistic regression (Christensen, 1997) and ANOVA/ANCOVA (Rutherford, 2011) to structural equation modeling (Yuan & Bentler, 2006). Researchers should report internal consistency, convergent and discriminant validity, model fit indices, and, when comparing groups, tests of measurement invariance; they should design for and diagnose common method variance in single-source designs (Fornell & Larcker, 1981; Henseler et al., 2015; Podsakoff et al., 2003). Qualitative analysis commonly employs thematic analysis (Terry et al., 2017), constant comparison (Leech & Onwuegbuzie, 2011), and case-based logics; credibility is strengthened through member checking, audit trails, reflexive memos, and thick description (Braun & Clarke, 2006; Miles et al., 2014). Mixed-methods analysis requires explicit integration to avoid parallel narratives and to achieve genuine complementarity (Venkatesh et al., 2013).

Validity, ethics, and feasibility have sport-specific contours (Robertson et al., 2017). Proprietary fan lists and event-based intercept sampling introduce coverage and nonresponse biases; organizational gatekeeping constrains access to data and people; sensitive topics such as integrity, safeguarding, and employee well-being elevate ethical risks. Quantitative work should avoid causal language without appropriate designs (e.g., longitudinal or experimental) and report sampling and invariance transparently. Qualitative work should foreground researcher positionality and participant protections in hierarchically structured settings. Mixed-methods projects must sequence realistically given club and league timetables and resource trade-offs, and they should document how integration informed interpretation and recommendations. In all cases, a defensible design in sport requires explicit attention to method–question fit, data access and quality, and the ethical implications of studying passionate publics and vulnerable stakeholders (Andrew et al., 2019).

2.2 Methods most used in sport management

Within sport management, several methods recur because they align well with common questions and available data (Veal & Darcy, 2014).
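Before turning to those methods, the psychometric reporting described in section 2.1 can be made concrete. The sketch below computes Cronbach's alpha from hypothetical item scores, and composite reliability (CR) and average variance extracted (AVE) from hypothetical standardized loadings; it illustrates the quantities reviewers expect rather than a full validation workflow.

```python
# Minimal sketch of common psychometric quantities: Cronbach's alpha from item
# scores, plus composite reliability (CR) and average variance extracted (AVE)
# from standardized CFA loadings. Items and loadings are hypothetical.
import numpy as np
import pandas as pd

items = pd.DataFrame({   # four items of one reflective construct, 1-5 scale
    "id1": [4, 5, 3, 4, 5, 2, 4],
    "id2": [4, 4, 3, 5, 5, 2, 4],
    "id3": [5, 5, 2, 4, 4, 3, 4],
    "id4": [4, 5, 3, 4, 5, 2, 5],
})

def cronbach_alpha(df: pd.DataFrame) -> float:
    k = df.shape[1]
    item_var = df.var(axis=0, ddof=1).sum()
    total_var = df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def cr_and_ave(loadings: np.ndarray) -> tuple[float, float]:
    # Standardized loadings from a CFA; error variance = 1 - loading**2.
    errors = 1 - loadings**2
    cr = loadings.sum()**2 / (loadings.sum()**2 + errors.sum())
    ave = (loadings**2).mean()
    return cr, ave

loadings = np.array([0.82, 0.78, 0.71, 0.80])  # hypothetical CFA output
print(f"alpha = {cronbach_alpha(items):.3f}")
cr, ave = cr_and_ave(loadings)
print(f"CR = {cr:.3f}, AVE = {ave:.3f}  (common thresholds: CR > .70, AVE > .50)")
```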
Cross-sectional surveys analyzed with regression or SEM are dominant in consumer research, service and experience quality, brand and loyalty, and sponsorship effectiveness; interviews and case studies are prevalent in governance, leadership, and organizational change; ethnography and thematic analysis appear where lived experience and culture are central; and mixed-methods designs emerge when researchers connect patterned outcomes with process explanations (Trail & James, 2001; Greenwell et al., 2002; Yoshida & James, 2010; Wicker et al., 2013).

In quantitative consumer and sponsorship work, researchers typically use multi-item scales to operationalize constructs such as motivation, identification, perceived value, brand associations, and perceived sponsor–team fit (Olson & Thjømøe, 2011). These are validated through factor-analytic procedures and modeled via regression or SEM to estimate direct and indirect effects on satisfaction, word-of-mouth, and behavioral intentions. The strength of this approach lies in clear construct–indicator mapping and the ability to test theoretically specified pathways; its main limitations are reliance on single-wave self-reports, vulnerability to common method variance, and restricted causal inference (Fornell & Larcker, 1981; Podsakoff et al., 2003). Where organizations grant access to transactional or engagement data, researchers can augment surveys with behavioral outcomes (renewals, purchases, attendance), adopt longitudinal designs, or embed field experiments in communications, thereby strengthening inference.

In governance and organizational studies, qualitative interviews and case studies enable researchers to access decision processes, role negotiations, and strategic capability building within sport organizations (Morse & McEvoy, 2014). These designs allow triangulation of interviews with observations and internal documents, producing context-rich explanations of change and performance. Their strengths are depth and ecological validity; their limitations include challenges to generalization and the need for reflexive, transparent analytic procedures to ensure credibility and transferability (Yin, 2018; Miles et al., 2014). Ethnographic approaches extend this depth by immersing researchers in match-day operations, volunteer management, or online fan communities, revealing tacit norms and emotional labor that surveys rarely capture (Hammersley & Atkinson, 2019).

Mixed-methods studies are particularly well suited to sport because many managerial problems involve attitudes and meanings (captured qualitatively) as well as behaviors and outcomes (captured quantitatively; Rudd & Johnson, 2010). Explanatory sequential designs can begin with large-scale modeling of satisfaction and renewal, followed by interviews with atypical cases to diagnose barriers; exploratory sequential designs can build new measures of board capability from qualitative insights and then generalize via survey; convergent designs can contrast what fans report about sponsorship engagement with how managers describe activation decisions.
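The explanatory sequential logic above can be made concrete with a small sketch: the quantitative strand flags atypical cases, here "defectors" who report high satisfaction but do not renew, and those cases become the purposive sample for follow-up interviews. Data and thresholds are hypothetical.

```python
# Minimal sketch of explanatory sequential case selection: flag high-satisfaction
# non-renewers ("defectors") from linked survey and renewal data, then hand the
# list to the qualitative strand for purposive interview sampling.
import pandas as pd

df = pd.DataFrame({
    "fan_id":       [1, 2, 3, 4, 5, 6],
    "satisfaction": [4.8, 2.1, 4.6, 4.9, 3.0, 4.7],  # 1-5 survey scale
    "renewed":      [1,   0,   0,   1,   0,   0],     # archival renewal flag
})

# Interview candidates: satisfied fans who nevertheless did not renew.
defectors = df[(df["satisfaction"] >= 4.5) & (df["renewed"] == 0)]
print(defectors[["fan_id", "satisfaction"]])
# The interview guide would then probe barriers (pricing, scheduling, relocation)
# that the survey alone could not capture.
```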
Integration (rather than mere coexistence) should be the hallmark of such projects, with joint displays and explicit meta-inferences guiding recommendations (Johnson & Onwuegbuzie, 2004; Venkatesh et al., 2013).

To crystallize these patterns, Table 1 links influential sport management studies to their principal design choices (study design, data collection, analytic approach) and adds brief notes on strengths and limitations relevant to method selection.

This mapping illustrates why certain methods became dominant in sport management and where their boundaries lie. Survey-based SEM, for instance, has been exceptionally productive in clarifying the structure of fan experience, brand associations, and loyalty drivers. At the same time, governance and organizational change have required designs capable of opening the "black box" of process: interviews, case studies, and, where feasible, ethnography. Finally, where organizations provide behavioral data or permit intervention, the field can progress beyond association to stronger inference through longitudinal, quasi-experimental, or experimental designs, and through mixed-methods integration that connects patterns to mechanisms. Together, these insights provide a domain-specific foundation for selecting methods that fit the question, the data environment, and the ethical constraints of sport organizations.

Table 1: Integrative mapping of influential sport management studies to method choices

Biscaia et al. (2013). Study design: quantitative survey; cross-sectional. Data collection: survey of fans of a professional soccer team (loyalty, sponsorship awareness, purchase intention). Analytic approach: SEM. Notes: integrates value perceptions; cross-sectional design limits causal claims.

Cornwell (2013). Study design: conceptual/theoretical review and synthesis. Data collection: no primary data; draws on prior literature. Analytic approach: critical literature review and conceptual integration. Notes: provides a high-level synthesis of sponsorship research; applicability to specific contexts may be limited.

Filo, Lock, & Karg (2015). Study design: systematic literature review. Data collection: 70 peer-reviewed articles on social media and sport in English-language sport management journals. Analytic approach: categorization of studies into three domains: (1) strategic, (2) operational, and (3) user-focused. Notes: comprehensive review provides a structured overview of the field; limited to the sport management journals.

Greenwell, Fink, & Pastore (2002). Study design: quantitative survey; cross-sectional. Data collection: survey of 218 minor league ice hockey spectators. Analytic approach: multiple regression and hierarchical regression analyses. Notes: empirically examines the relative importance of physical facilities within the broader service experience; single-sport context limits generalizability.

Gwinner & Bennett (2008). Study design: quantitative, cross-sectional survey. Data collection: survey of 552 attendees at the Dew Action Sports Tour. Analytic approach: SEM. Notes: expands the sponsorship literature by shifting focus from outcomes of fit to predictors of fit; single event and location may limit generalizability.

Kunkel, Funk, & Hill (2013). Study design: quantitative, cross-sectional survey. Data collection: online questionnaire of football consumers (n = 752). Analytic approach: CFA, MANOVA, paired-sample t tests, frequency analysis, chi-square tests, and linear regression. Notes: large sample provides statistical power and reliability; focus on a single sport may limit generalizability.

Shilbury & Ferkins (2011). Study design: qualitative action research combined with literature integration. Data collection: empirical data from a larger action research study of New Zealand national sport organization boards. Analytic approach: thematic analysis. Notes: provides empirical insights into the strategic functioning of sport governance boards; context-specific.

Wicker & Breuer (2011). Study design: quantitative, large-scale survey with a longitudinal component. Data collection: cross-sectional survey (2007, n = 13,068 clubs) and longitudinal survey (2005-2007, n = 1,648 clubs). Analytic approach: step 1, cross-sectional subjective scarcity measure; step 2, detailed analysis of each capacity dimension; step 3, objective scarcity measure through longitudinal indexes comparing 2005-2007, tested for statistical significance with paired t-tests. Notes: massive sample provides representativeness; longitudinal period limited to two years.

Note. CFA = confirmatory factor analysis; SEM = structural equation modeling; MANOVA = multivariate analysis of variance.

3 DECISION-MAKING FRAMEWORK FOR METHOD SELECTION IN SPORT MANAGEMENT

We propose a five-step framework for transparent, defensible method selection tailored to sport contexts. Although presented sequentially, these steps are iterative in practice: researchers move back and forth as access evolves, ethical issues surface, and theoretical clarity improves. The aim is not to prescribe a single "right" design but to make method–question fit explicit, to surface constraints early, and to document choices in ways that strengthen credibility and usefulness for sport organizations.

Step 1: Align with the research question

The starting point is conceptual, not technical: clarifying what kind of claim the study must support. If the primary aim is to test relationships or effects, for example whether perceived sponsor–team fit increases purchase intention, then a quantitative design that estimates the size and direction of effects with appropriate controls is usually warranted (Black, 1999). If the aim is to understand processes or meanings, such as how governance reforms unfold in a national sport organization or why fans co-create particular narratives around a club, then a qualitative design that traces mechanisms and interpretations is more suitable (Skinner et al., 2020).
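For the first kind of aim, estimating the association between perceived sponsor–team fit and purchase intention with controls, a minimal sketch with simulated data and hypothetical variable names might look as follows; with single-wave survey data the interpretation stays associational rather than causal.

```python
# Minimal sketch: regression of purchase intention on perceived sponsor-team fit
# with a control for team identification. Data are simulated; variable names are
# hypothetical placeholders for multi-item scale scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 300
sponsor_fit = rng.normal(3.5, 0.8, n)        # perceived sponsor-team fit (1-5)
identification = rng.normal(3.2, 0.9, n)     # team identification (control)
intention = 0.6 * sponsor_fit + 0.3 * identification + rng.normal(0, 0.7, n)

data = pd.DataFrame({
    "purchase_intention": intention,
    "sponsor_fit": sponsor_fit,
    "identification": identification,
})

model = smf.ols("purchase_intention ~ sponsor_fit + identification", data=data).fit()
print(model.summary())  # coefficients, standard errors, R-squared
```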
Many sport management problems demand both: we need estimates of patterned relationships that generalize and thick explanations of how and why those relationships arise in particular settings. In such cases, mixed-methods designs can integrate complementary strands within a single, coherent program of inquiry (Venkatesh et al., 2013).

Framing the question also means tying it explicitly to theory. In consumer and fan research, for example, the Psychological Continuum Model (Funk & James, 2001) and identification–loyalty frameworks specify mechanisms that translate into testable paths for SEM or regression. In governance and leadership, theories of board capability or organizational learning (Klarner et al., 2021) motivate process questions better served by case study or ethnography (Hammersley & Atkinson, 2019). When researchers anchor their questions in theory, design becomes cumulative rather than ad hoc: constructs are clearer, rival explanations can be specified, and appropriate evidence standards follow from the conceptual claims.

Two practical heuristics help at this step. First, ask whether the claim is causal, associational, or interpretive. Causal claims require designs that justify counterfactual inferences (experiments, strong quasi-experiments, or longitudinal cross-lagged models; Shadish, 2002). Associational claims can be addressed with cross-sectional models if measurement is sound and language stays non-causal. Interpretive claims prioritize context, meaning, and mechanism, and they rely on transparent qualitative procedures for credibility (Lincoln, 1985; Maxwell, 2013; Braun & Clarke, 2006). Second, specify who/what/when/where with precision. A question such as "What drives renewal?" can be sharpened to "Among season-ticket holders with at least two years of tenure (who), which aspects of perceived fairness in pricing communications (what) predict renewal in the next cycle (when), controlling for performance and seat location (where)?" Such sharpening naturally points to feasible data, models, and, if needed, qualitative follow-ups to understand anomalies.

Step 2: Consider data availability in sport settings

Method–question fit is constrained and enabled by what data exist and can be ethically accessed. Sport organizations sit on rich stores of archival and transactional data and, in high-pressure settings, wearable and biometric streams (Andrew et al., 2019). These sources can support panel models, event-history analyses of churn, and quasi-experimental designs that leverage staggered rollouts or natural experiments. When customer-level linkage is possible, longitudinal modeling and segmentation become realistic; when only aggregate data are available, time-series or difference-in-differences analysis at the unit level (e.g., game or month) may be feasible, as sketched below. Where archival data are not accessible, well-designed primary data collection becomes the backbone of the design.

Early, candid conversations with clubs, leagues, and national sport organizations are crucial to match organizational utility and research rigor. Gatekeepers may constrain sampling frames (e.g., only email subscribers), impose timing windows (e.g., off-season only), or request limits on experimental manipulations.
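A minimal sketch of the difference-in-differences option mentioned above, assuming a deliberately simplified rollout (half of the units treated after a fixed game) and simulated game-level attendance:

```python
# Minimal sketch of a difference-in-differences comparison at the game level.
# Half of the venue/segment units receive a rollout (e.g., a new loyalty feature)
# after game 5; attendance is simulated with a built-in effect so the estimate can
# be checked. Names and numbers are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for unit in range(40):                  # 40 venue/segment units
    treated = int(unit < 20)            # first 20 units get the rollout
    for game in range(10):              # 10 home games per unit
        post = int(game >= 5)           # rollout happens after game 5
        attendance = (
            8000 + 400 * treated + 250 * post
            + 600 * treated * post       # effect the DiD estimator should recover
            + rng.normal(0, 300)
        )
        rows.append({"unit": unit, "attendance": attendance,
                     "treated": treated, "post": post})

panel = pd.DataFrame(rows)
did = smf.ols("attendance ~ treated * post", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["unit"]}  # cluster SEs by unit
)
print(did.params["treated:post"], did.bse["treated:post"])
```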
Researchers should inventory feasible data sources, identify what can be linked (and at what level), document data quality (coverage, missingness, measurement issues), and anticipate access failures by preparing fallback designs. Transparency about these realities strengthens the credibility of inferences and signals respect for organizational partners.

This step also includes assessing digital traces. Social media and app analytics can capture revealed engagement, complementing self-reports and enabling convergent validation. However, platform metrics can be volatile and proprietary (Verbeij et al., 2022); researchers should document how metrics are defined, whether algorithms changed during observation, and how such changes were handled analytically. When combining digital traces with survey or experimental data, plan the integration from the start (e.g., unique tokens to link responses to behavior, within contractual limits and with informed consent).

Step 3: Balance philosophical stance and practical constraints

In a post-positivist view, priority is given to hypothesis testing, statistical control, and approximate causal explanation, which aligns with experiments, quasi-experiments, and longitudinal models. An interpretivist/constructivist stance privileges meaning and context, aligning with ethnography, case studies, and in-depth interviewing. A pragmatic stance legitimizes mixed methods, selecting tools that best answer the question given constraints (Creswell & Creswell, 2017; Bryman, 2016). Making this stance explicit strengthens coherence between questions, evidence standards, and analytic choices.

Alongside stance, researchers must weigh practical constraints (e.g., time, budget, access, staff skills, organizational risk appetite) and choose designs that are both rigorous and feasible. A randomized controlled field experiment on pricing communications may be ideal but infeasible if a club is unwilling to randomize renewal emails; a quasi-experiment that exploits a phased rollout or an A/B test in a subset of channels may be acceptable and still improve causal leverage. If repeated measures are impossible, researchers can mitigate limitations in cross-sectional surveys by designing for common method variance reduction (proximal/psychological separation, varied scale formats), including marker variables, and, where feasible, collecting multi-source outcomes (Podsakoff et al., 2003). In qualitative projects, if prolonged ethnography is infeasible, a multiple-case design with purposeful sampling and replication logic can still yield robust process explanations (Eisenhardt, 1989).

Skill sets also matter. SEM requires competence in model specification, identification, and diagnostics (Kline, 2023; Hair, 2009); event-history modeling and panel data require econometric expertise (Box-Steffensmeier & Jones, 2004); ethnography demands reflexivity and disciplined fieldwork (Hammersley & Atkinson, 2019). If the research team lacks a critical skill, collaboration or training is preferable to forcing a method ill-suited to the team's capacity. Reviewers and editors in sport management increasingly reward designs that are well executed over those that are merely fashionable.
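For the event-history option noted above, a minimal sketch using the lifelines package (one of several survival-analysis libraries; its use here is an assumption, and the data are simulated):

```python
# Minimal sketch of an event-history (survival) model of season-ticket churn.
# The lifelines package is assumed to be available; tenure, covariates, and the
# churn flag are simulated purely for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "tenure_years": rng.integers(1, 12, n),    # observed holding time in years
    "churned": rng.integers(0, 2, n),          # 1 = did not renew (the event)
    "satisfaction": rng.normal(3.5, 0.8, n),   # survey covariate
    "distance_km": rng.gamma(2.0, 15.0, n),    # archival covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="tenure_years", event_col="churned")
cph.print_summary()  # hazard ratios for satisfaction and distance to the venue
```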
Step 4: Ethical considerations specific to sport

Privacy and informed consent are paramount when studies involve ticketing records and wearable or biometric data (Osborne, 2017). Researchers should adopt data minimization, store identifiable data securely, and obtain informed consent proportionate to the sensitivity of the data and the risks involved. Where contracts restrict data sharing, researchers can still enhance transparency by sharing synthetic codebooks, analysis code, and de-identified outputs consistent with agreements (Bai & Bai, 2021).

Athletes, volunteers, junior employees, and even fans may feel obligated to participate, particularly when studies are brokered by the organization. Protocols should provide independent consent channels, assure participants that non-participation has no consequences, and allow withdrawal without penalty. In qualitative work, researchers must be vigilant about confidentiality in small communities where roles are identifiable; plans for disguising cases and removing indirect identifiers should be set in advance.

Reputational risk is acute in governance, integrity, or safeguarding work. Designs should anticipate the potential for harm to individuals and organizations. Data handling, anonymization, and reporting conventions should be agreed with partners before data collection (Oetzel & Spiekermann, 2014). Researchers should also plan for adverse findings: if crises or misconduct are uncovered, the protocol must specify how information will be handled, consistent with legal and ethical obligations.

Finally, sport research often involves minors and vulnerable groups (youth athletes; para-sport). Tailored consent/assent procedures, additional protections, and, where required, external ethical approvals are mandatory. The ethical stance should be integrated into design decisions from the outset, not appended as a compliance step. In qualitative projects, reflexive memos can document ethical decision points; in quantitative projects, preregistration can clarify analytic intentions without disclosing proprietary data (Creswell & Creswell, 2017; Lincoln, 1985).

Step 5: Integration and innovation

The final step is forward-looking: choosing methods that not only answer the immediate question but also advance cumulative knowledge and inform decisions in sport organizations. Integration can proceed in either direction. Qualitative insights can be translated into variables and hypotheses for quantitative testing (e.g., codes on fairness narratives become survey items and experimental manipulations). Quantitative patterns can guide purposive sampling for qualitative follow-up (e.g., interviewing "defectors" who report high satisfaction but do not renew). True integration occurs at interpretation, where strands are brought together to generate meta-inferences that neither strand could support alone (Venkatesh et al., 2013).

Innovation often means deploying underused designs that fit sport's data realities. Longitudinal/panel models can track loyalty trajectories, separate state from trait effects, and test cross-lagged relations between identification and behavior.
Event-history models can estimate hazard rates for churn and identify time-varying covariates linked to retention (Box-Steffensmeier & Jones, 2004). Field experiments can test pricing and messaging at scale with minimal disruption, provided randomization is ethically and operationally acceptable (Shadish, 2002). Digital ethnography/netnography can uncover norms in online fan communities that shape advocacy and resistance (Fenton & Parry, 2022). Clustering and segmentation can be applied transparently to inform targeted activation (Nur & Siregar, 2024), avoiding the opacity of purely black-box models. These innovations are not ends in themselves; they are means to sharpen inference and practical relevance.

Transparent reporting underpins cumulative progress. Quantitative studies should report sampling frames, response rates, measurement properties (reliability; convergent/discriminant validity), model fit, robustness checks, and, when applicable, measurement invariance across groups (Fornell & Larcker, 1981; Henseler et al., 2015). Where feasible, preregistered analysis plans can reduce researcher degrees of freedom and clarify confirmatory versus exploratory components. Qualitative studies should specify sampling rationale, access, researcher positionality, coding procedures, theme development, and strategies for credibility (Braun & Clarke, 2006). Mixed-methods studies should present joint displays that align quantitative results and qualitative themes, make the logic of integration visible, and discuss convergence and divergence explicitly (Venkatesh et al., 2013). Even when data cannot be posted, sharing instruments, codebooks, and analysis code (with simulated data where necessary) enhances reproducibility.

4 DISCUSSION

This paper set out to close a persistent gap in sport management: the misalignment between the questions scholars and practitioners actually ask and the designs most commonly used to answer them. Drawing on established methodological foundations and domain exemplars, we argued for method–question fit as the organizing principle of design in sport, and we proposed a five-step, sport-specific framework to make that fit explicit, ethical, and feasible. In this discussion, we synthesize where the field stands, highlight underused opportunities that match sport's data realities, and clarify the contributions of this article, both practical and theoretical. We close by acknowledging limitations and outlining a future research agenda that can accelerate cumulative, credible knowledge production.

4.1 Underused opportunities: broadening the repertoire without breaking feasibility

The methodological center of gravity in sport management remains cross-sectional surveys analyzed with regression or SEM, complemented by qualitative case work in governance and leadership. That center has yielded durable measurement traditions and mid-range theory around motivation, identification, perceived value, brand associations, service quality, and sponsorship mechanisms (Trail & James, 2001; Greenwell et al., 2002; Yoshida & James, 2010; Kunkel et al., 2013). Yet much of what matters to organizations is dynamic, contextual, and multilevel and thus poorly served by one-wave self-reports. Four opportunity spaces deserve emphasis.

(1) Temporal designs.
Loyalty development, renewal, and sponsorship ROI unfold over time; so do governance reforms and culture change. Longitudinal panels and cross-lagged models can adjudicate directionality claims that cross-sectional SEM cannot (Kline, 2023; Hair, 2009). Where customer-level linkage is possible, event-history (survival) models can estimate churn hazards and time-varying covariates, a natural fit for ticketing (Box-Steffensmeier & Jones, 2004).

(2) Field and quasi-experiments. A/B tests embedded in routine communications (email, app, social) can evaluate message framing, sponsorship disclosure, or price fairness cues at scale; where randomization is not possible, staggered rollouts and other quasi-experimental strategies can meaningfully improve causal leverage (Shadish, 2002). These designs align with the operational rhythms and risk tolerances of clubs and leagues.

(3) Digital ethnography and netnography. Fan communities are partly constituted online; ethnographic approaches can surface the norms and informal governance that shape co-creation, advocacy, and resistance, providing mechanisms that complement quantitative patterns (Hammersley & Atkinson, 2019; Filo et al., 2015). Linking these qualitative insights to behavioral analytics strengthens both explanation and prediction.

(4) Archival and administrative data. Econometric analyses of attendance, membership tenure, facility usage, and funding (often held by clubs, leagues, or municipalities) extend external validity and reduce sole reliance on self-report (Wicker & Breuer, 2011). With careful governance and privacy protection, these sources can be integrated into mixed designs.

4.2 Theoretical contributions

This paper makes three connected contributions about how evidence should warrant claims in sport management. First, we offer a sport-specific evidentiary logic that ties claim types to appropriate designs. We distinguish causal, associational, and interpretive claims and specify minimal adequate designs given sport's data realities. Causal assertions (e.g., effects of activation framing or renewal communications) call for randomized or strong quasi-experimental designs, or longitudinal models that establish temporal precedence (Shadish, 2002; Kline, 2023). Associational claims (e.g., identification ↔ word-of-mouth) can rely on cross-sectional regression/SEM if measurement is rigorous and language remains non-causal (Hu & Bentler, 1999; Hair, 2009). Interpretive claims (e.g., how board capability emerges; how fan communities co-create meaning) are best warranted through transparent qualitative designs (Lincoln, 1985; Braun & Clarke, 2006; Miles et al., 2014). The novelty lies in contextualizing this mapping for sport: separating selection from persuasion in sponsorship becomes a design choice (experiment/panel), and distinguishing satisfaction-driven renewal from structural inertia points to event-history modeling (Box-Steffensmeier & Jones, 2004).

Second, we provide an integration blueprint that composes variance and process explanations across levels typical in sport.
Quantitative models delimit the space of plausible mechanisms and estimate for whom/how much; qualitative analyses reveal the sequences, routines, and meanings through which effects are produced or blocked; mixed methods coordinate both to yield meta-inferences that travel further than either strand alone (Johnson & Onwuegbuzie, 2004; Venkatesh et al., 2013). Our five-step framework operationalizes this by planning integration at design time (e.g., sampling quantitative "outliers" for interview follow-ups; building survey items from qualitative codes; using joint displays), so theories accrue as linked variance–process propositions rather than parallel narratives.

Third, we advance a reliability and transportability charter suited to proprietary, access-constrained sport contexts. Instead of unrealistic "share everything" prescriptions, we specify practices that raise credibility: preregistration where feasible; explicit reporting of sampling frames and data governance; instrument and code sharing with synthetic data; measurement invariance checks for group comparisons (Henseler et al., 2015); and qualitative audit trails that protect identities (Miles et al., 2014). We also foreground transportability, arguing, with evidence, what is likely to generalize across clubs, leagues, and countries and what is local.

4.3 Practical implications

For sport organizations, this article's mapping from question types to feasible designs translates directly into better decision-making. Marketing and ticketing teams can prioritize field experiments and event-history models to optimize renewal messaging and reduce churn; sponsorship units can combine A/B tests of activation framing with survey-based SEM to separate persuasion from selection; community and participation programs can use longitudinal tracking to evidence impact rather than relying on one-off satisfaction polls. Governance and HR leaders can commission qualitative case work to diagnose capability, culture, and psychological safety before scaling changes. The five-step framework also clarifies data governance and ethics by design (e.g., informed consent for wearable data, independent opt-outs for athletes and volunteers), helping clubs and federations align legal, reputational, and analytical considerations early.

For researchers and graduate programs, the framework offers a curriculum and workflow upgrade. Methods teaching should move beyond tool proficiency toward method–question fit, adding practical modules on partner negotiations, preregistration, measurement invariance, CAQDAS-supported analysis, joint displays, and reproducible code sharing (with synthetic data when required). When pursuing club or federation partnerships, scholars can use the framework to set realistic sequencing (e.g., survey → panel → experiment), to document trade-offs between rigor and access, and to ensure transparent reporting that meets journal standards even under proprietary constraints. Departments and centers can institutionalize impact by hosting instrument/code repositories, ethical templates, and mixed-methods exemplars specific to sport.
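As an illustration of the field-experimental option recommended above for renewal messaging, the simplest analysis of a two-arm A/B test is a comparison of renewal proportions; the counts below are hypothetical.

```python
# Minimal sketch of how a two-arm A/B test of renewal messaging might be analyzed:
# a two-proportion z-test on renewal rates in the control and treatment arms.
# Counts are hypothetical; in practice the test would be preregistered and the
# randomization agreed with the club's ticketing team.
from statsmodels.stats.proportion import proportions_ztest

renewals = [620, 668]      # renewed season tickets: control vs. new framing
recipients = [1500, 1500]  # emails delivered per arm

z_stat, p_value = proportions_ztest(count=renewals, nobs=recipients)
print(f"renewal rates: {renewals[0]/recipients[0]:.3f} vs {renewals[1]/recipients[1]:.3f}")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```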
The net effect is a portfolio of studies that are more causally credible, contextually insightful, and actionable.

4.4 Limitations and future research ideas

This article is intentionally pragmatic rather than exhaustive. Our synthesis draws on influential exemplars and widely used methodological texts to build a sport-specific logic of evidence, but it is not a systematic review of every subdomain. As a result, niche areas (e.g., esports governance, para-sport participation, women's professional leagues) may involve constraints or opportunities that differ from those highlighted here. A second limitation is that we do not empirically test the five-step framework; its value is normative and organizing. Finally, sport systems vary widely in legal regimes, data infrastructures, and governance models; what counts as feasible (e.g., randomization, customer-level linkage) in one league may be unrealistic elsewhere. Researchers should therefore treat the framework as a scaffold to adapt, not a template to apply mechanically.

Future work should evaluate the framework in practice. One promising path is to run design-registered "method deployments" in partnership with clubs or federations: teams would prospectively apply the five steps, preregister designs where feasible, and then report feasibility, partner utility, and evidentiary quality (e.g., causal leverage, transportability). Comparative work could test the same question, such as season-ticket renewal or sponsorship activation, under alternative designs including cross-sectional SEM, panel studies, and field experiments, and across different leagues and countries. Meta-science audits of published sport management studies that track reporting of psychometrics, measurement invariance, remedies for common method variance, qualitative trustworthiness, and the integration of mixed methods would help calibrate journal standards and reveal persistent gaps that training or guidelines should address.

A second strand should build infrastructure for cumulative work. Priorities include shared instrument repositories with documented psychometrics and invariance properties; open, well-annotated analysis code (paired with synthetic datasets when raw data cannot be shared); template agreements for ethical data governance with sport organizations; and exemplars of joint displays and meta-inferences to normalize strong mixed-methods practice. Substantively, we encourage more longitudinal and experimental programs on loyalty trajectories, churn, and pricing fairness; event-history and panel models that integrate archival and behavioral data; and digital ethnography/netnography that links community norms to measurable engagement. Cross-league and cross-culture comparisons should explicitly test transportability and boundary conditions, while participatory and co-design approaches with athletes, fans, and staff can surface ethical and practical constraints early, improving both rigor and relevance.

5 CONCLUSION

This paper has argued that advancing sport management scholarship and practice depends less on adding methods to our toolkit than on achieving method–question fit within the realities of sport organizations.
By mapping claim types to appropriate designs, consolidating sport-specific guardrails for quality, and proposing a five-step design framework, we offer a practical route from research ideas to defensible studies that generate credible, decision-relevant evidence. The review of dominant approaches and exemplars shows where current strengths lie and where temporal, experimental, ethnographic, and integrated designs can lift the evidentiary bar.

The task now is implementation. Researchers should begin with concise design briefs, negotiate access that enables longitudinal, experimental, or mixed-methods work where warranted, and report transparently so findings travel across clubs, leagues, and cultures. Organizations and journals can accelerate this shift by rewarding fit-for-purpose designs, establishing clear data-governance pathways, and normalizing open materials (instruments, code, synthetic data) when full sharing is impossible.

EXTENDED SUMMARY/IZVLEČEK

Raziskave na področju športnega managementa pogosto temeljijo na presečnih anketah, analiziranih z regresijo ali modeliranjem strukturnih enačb, tudi kadar raziskovalna vprašanja in podatkovna okolja zahtevajo časovne, eksperimentalne, etnografske ali integrirane raziskovalne zasnove. Ta članek obravnava to neusklajenost z uveljavljanjem načela ujemanja metode in raziskovalnega vprašanja kot osrednjega vodila pri raziskovanju v športnem managementu. V prispevku (a) sintetiziramo, kako se v športu dejansko uporabljajo kvantitativne, kvalitativne in mešane metode ter kje ima vsaka svoje prednosti; (b) predstavljamo petstopenjski, za šport specifičen odločitveni okvir (uskladitev vprašanja in teorije, presoja podatkov in dostopa, uravnoteženje epistemologije in izvedljivosti, etično načrtovanje raziskave ter integracija metod za inovativnost); in (c) združujemo temeljne smernice za kakovost (psihometrične lastnosti, prileganje in invarianco modelov, verodostojnost kvalitativnih raziskav ter integracijo mešanih metod). Teoretično prispevek oblikuje športno specifično dokazno logiko, integracijski načrt, ki povezuje pojasnjevanje variance in procesov, ter okvir za zanesljivost in prenosljivost, prilagojen lastniškim podatkovnim okoljem. Zaključujemo s praktičnimi implikacijami za management v športu in usposabljanje ter s predlogom raziskovalne agende, ki poudarja longitudinalne, eksperimentalne, etnografske in mešane raziskovalne programe.

REFERENCES

Andrew, D. P., Pedersen, P. M., & McEvoy, C. D. (2019). Research methods and design in sport management. Human Kinetics.

Bai, Z., & Bai, X. (2021). Sports big data: Management, analysis, applications, and challenges. Complexity, 2021(1), 6676297.

Biscaia, R., Correia, A., Rosado, A. F., Ross, S. D., & Maroco, J. (2013). Sport sponsorship: The relationship between team loyalty, sponsorship awareness, attitude toward the sponsor, and purchase intentions. Journal of Sport Management, 27(4), 288-302.

Black, T. R. (1999). Doing quantitative research in the social sciences: An integrated approach to research design, measurement and statistics. Sage.

Box-Steffensmeier, J. M., & Jones, B. S. (2004). Event history modeling: A guide for social scientists. Cambridge University Press.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.

Bryman, A. (2016). Social research methods. Oxford University Press.

Christensen, R. (1997). Log-linear models and logistic regression. Springer.

Cornwell, T. B. (2013). State of the art and science in sponsorship-linked marketing. Handbook of research on sport and business, 456-476.

Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches. Sage.

DeVellis, R. F., & Thorpe, C. T. (2021). Scale development: Theory and applications. Sage.

Doherty, A. (2013). "It takes a village:" Interdisciplinary research for sport management. Journal of Sport Management, 27(1), 1-10.

Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532-550.

Fenton, A., & Parry, K. D. (2022). Netnography: An approach to ethnography in the digital age. The SAGE handbook of social media research methods, 214-227.

Field, A. (2024). Discovering statistics using IBM SPSS Statistics. Sage.

Filo, K., Lock, D., & Karg, A. (2015). Sport and social media research: A review. Sport Management Review, 18(2), 166-181.

Fischer, H. E., Boone, W. J., & Neumann, K. (2023). Quantitative research designs and approaches. In Handbook of research on science education (pp. 28-59). Routledge.

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39-50.

Funk, D. C., & James, J. (2001). The psychological continuum model: A conceptual framework for understanding an individual's psychological connection to sport. Sport Management Review, 4(2), 119-150.

Greenwell, T. C., Fink, J. S., & Pastore, D. L. (2002). Assessing the influence of the physical sports facility on customer satisfaction within the context of the service experience. Sport Management Review, 5(2), 129-148.

Gwinner, K., & Bennett, G. (2008). The impact of brand cohesiveness and sport identification on brand fit in a sponsorship context. Journal of Sport Management, 22(4), 410-426.

Hair, J. F. (2009). Multivariate data analysis.

Hammersley, M., & Atkinson, P. (2019). Ethnography: Principles in practice. Routledge.

Henseler, J., Ringle, C. M., & Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. Journal of the Academy of Marketing Science, 43(1), 115-135.

Hinkin, T. R. (1998). A brief tutorial on the development of measures for use in survey questionnaires. Organizational Research Methods, 1(1), 104-121.

Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1-55.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.

Klarner, P., Yoshikawa, T., & Hitt, M. A. (2021). A capability-based view of boards: A new conceptual framework for board governance. Academy of Management Perspectives, 35(1), 123-141.

Kline, R. B. (2023). Principles and practice of structural equation modeling. Guilford Publications.

Kunkel, T., Funk, D., & Hill, B. (2013). Brand architecture, drivers of consumer involvement, and brand loyalty with professional sport leagues and teams. Journal of Sport Management, 27(3), 177-192.

Leech, N. L., & Onwuegbuzie, A. J. (2011). Beyond constant comparison qualitative data analysis: Using NVivo. School Psychology Quarterly, 26(1), 70.

Lincoln, Y. S. (1985). Naturalistic inquiry (Vol. 75). Sage.

Maxwell, J. A. (2013). Qualitative research design: An interactive approach. Sage.

Miles, M. B., Huberman, A. M., & Saldana, J. (2014). Qualitative data analysis: A methods sourcebook. Sage.

Morse, A. L., & McEvoy, C. D. (2014). Qualitative research in sport management: Case study as a methodological approach. The Qualitative Report, 19(17), 1-13.

Nick, T. G. (2007). Descriptive statistics. Topics in Biostatistics, 33-52.

Nur, M. F., & Siregar, A. (2024). Exploring the use of cluster analysis in market segmentation for targeted advertising. IAIC Transactions on Sustainable Digital Innovation (ITSDI), 5(2), 158-168.

Oetzel, M. C., & Spiekermann, S. (2014). A systematic methodology for privacy impact assessments: A design science approach. European Journal of Information Systems, 23(2), 126-150.

Olson, E. L., & Thjømøe, H. M. (2011). Explaining and articulating the fit construct in sponsorship. Journal of Advertising, 40(1), 57-70.

Osborne, B. (2017). Legal and ethical implications of athletes' biometric data collection in professional sport. Marquette Sports Law Review, 28, 37.

Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice. Sage.

Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88(5), 879.

Robertson, S., Kremer, P., Aisbett, B., Tran, J., & Cerin, E. (2017). Consensus on measurement properties and feasibility of performance tests for the exercise and sport sciences: A Delphi study. Sports Medicine - Open, 3(1), 2.

Rudd, A., & Johnson, R. B. (2010). A call for more mixed methods in sport management research. Sport Management Review, 13(1), 14-24.

Rutherford, A. (2011). ANOVA and ANCOVA: A GLM approach (Vol. 658). Wiley.

Shadish, W. R. (2002). Experimental and quasi-experimental designs for generalized causal inference. Wadsworth Cengage Learning.

Shilbury, D., & Ferkins, L. (2011). Professionalisation, sport governance and strategic capability. Managing Leisure, 16(2), 108-127.

Skinner, J., Edwards, A., & Smith, A. C. (2020). Qualitative research in sport management. Routledge.

Smith, A. C., & Stewart, B. (2013). The special features of sport: A critical revisit. In Handbook of research on sport and business (pp. 526-547). Edward Elgar Publishing.

Tashakkori, A., & Teddlie, C. (Eds.). (2010). Sage handbook of mixed methods in social & behavioral research. Sage.

Terry, G., Hayfield, N., Clarke, V., & Braun, V. (2017). Thematic analysis. The SAGE handbook of qualitative research in psychology, 2(17-37), 25.

Trail, G. T., & James, J. D. (2001). The motivation scale for sport consumption: Assessment of the scale's psychometric properties. Journal of Sport Behavior, 24(1).

Veal, A. J., & Darcy, S. (2014). Research methods in sport studies and sport management: A practical guide. Routledge.

Venkatesh, V., Brown, S. A., & Bala, H. (2013). Bridging the qualitative-quantitative divide: Guidelines for conducting mixed methods research in information systems. MIS Quarterly, 21-54.

Verbeij, T., Pouwels, J. L., Beyens, I., & Valkenburg, P. M. (2022). Experience sampling self-reports of social media use have comparable predictive validity to digital trace measures. Scientific Reports, 12(1), 7611.

Washington, M., & Patterson, K. D. (2011). Hostile takeover or joint venture: Connections between institutional theory and sport management research. Sport Management Review, 14(1), 1-12.

Wicker, P., & Breuer, C. (2011). Scarcity of resources in German non-profit sport clubs. Sport Management Review, 14(2), 188-201.

Wicker, P., Hallmann, K., & Breuer, C. (2013). Analyzing the impact of sport infrastructure on sport participation using geo-coded data: Evidence from multi-level models. Sport Management Review, 16(1), 54-67.

Yin, R. K. (2018). Case study research and applications (Vol. 6). Sage.

Yoshida, M., & James, J. D. (2010). Customer satisfaction with game and service experiences: Antecedents and consequences. Journal of Sport Management, 24(3), 338-361.

Yuan, K. H., & Bentler, P. M. (2006). Structural equation modeling. Handbook of Statistics, 26, 297-358.