Abstract
This study examined whether professional development can translate a signed literacy instruction framework into high-fidelity classroom practice. Four K-3 teachers in an ASL/English bilingual program received a 2-day training followed by bi-weekly virtual coaching; entry into training was staggered in a nonconcurrent multiple-baseline single-case design. Teachers’ use of indicators of signed literacy instruction was scored for at least 7 baseline and 2 intervention instructional units per teacher. Visual analysis showed near-zero fidelity during baseline and an immediate jump to 45%–60% on the first post-training unit, with 3 teachers accelerating to 80%–90% and 1 maintaining at 60%. Log response ratio effect sizes ranged from 2.04 to 3.72, confirming large, consistent gains. Social validity interviews indicated that teachers valued the instructional framework, found it feasible, and planned to expand its use the following year. These findings demonstrate a functional relation between professional development in Strategic and Interactive Signing Instruction (SISI) and teachers’ implementation of signed literacy instruction. They underscore the value of sustained professional development and suggest that signed literacy instruction can be embedded in early elementary curricula to promote deaf children’s signing skills.
Keywords: deaf education, sign language, single case methods
Imagine an elementary classroom in the United States where deaf children spend hours each day learning about English, its grammar, structure, and literature but know little about American Sign Language (ASL) or its linguistic features and literary forms. Many deaf education programs use ASL as a language of instruction to teach English or other content areas, but few of them provide systematic instruction targeting signed literacy (Holcomb & Eberwein, 2025). Signed literacy, also referred to as signacy, is the multifaceted set of skills individuals need to comprehend and produce ASL texts (Gibson & Byrne, 2024). Teachers report wanting to teach signed literacy but lack a clear instructional framework, along with training and coaching on how to do it effectively (Holcomb, 2024; Holcomb & Higgins, 2024; Wolbers et al., 2023). This study offers teachers such a framework, applying well-established broader literacy instructional practices such as genre study (Duthie, 1994; National Governors Association Center for Best Practices, & Council of Chief State School Officers, 2010), strategy instruction (Harris et al., 2023), and gradual release (Shanahan et al., 2010) to the signed modality. Effectively integrating a new instructional framework into everyday teaching typically depends on high-quality professional development (Darling-Hammond & Richardson, 2009). The primary aim of this study is to examine whether professional development for signed literacy instruction can transform teachers’ instructional practices in deaf education classrooms. This paper begins with a review of relevant literature informing the theoretical and empirical foundation of the study, followed by a description of the research question and methodology. Results are then presented and discussed in relation to implications for signed literacy instruction and the broader field of deaf education.
Literature review
Broader principles of literacy instruction
Research shows that students learn best when literacy instruction integrates interaction and strategic approaches in meaningful communicative contexts (Graham & Perin, 2007; Pressley & Allington, 2014; Shanahan et al., 2010). In the domain of reading and writing, teachers should provide daily, sustained opportunities for students to engage with a range of texts. These texts should serve authentic purposes, situated in real-life contexts relevant to the students’ social worlds (Graham et al., 2012; Mirra & Garcia, 2021). Thus, in effective literacy classrooms, teachers foster a collaborative community of learners where students interact as readers and writers, sharing and discussing texts from both author and audience perspectives (Graham, 2018).
Decontextualized grammar instruction
Extensive research has shown that decontextualized grammar instruction centered on isolated definitions and worksheet drills, typical of traditional literacy practices, is largely ineffective (Andrews et al., 2006; Graham et al., 2012; Graham & Perin, 2007). Instead, grammar instruction is most effective when it is woven into meaningful reading and writing experiences (Jones et al., 2013; Fearn & Farnan, 2007; Graham & Perin, 2007; Graham et al., 2012). This approach supports students in learning about the functions of grammar by considering the meanings that they convey and reflecting on how grammar shapes the message and audience understanding (Myhill et al., 2013; Myhill & Watson, 2014). For example, a meta-analysis found that when students actively reflect on the function and practical application of grammar during the writing process, especially through culturally relevant literacy instruction that values diversity in language use, they internalize and utilize new grammar skills more successfully (Graham et al., 2012; Ladson-Billings, 1992; Woodard et al., 2017). These findings point to the importance of embedding grammar instruction within authentic contexts that connect directly to students’ lived experiences, identities, and desires to communicate.
Effective instructional methods
Instructionally, teachers are most effective when they tailor support to students’ literacy needs by targeting skills within each student’s zone of proximal development (Vygotsky, 1978), which is the range of tasks a student cannot yet do independently but can accomplish with guidance from someone more knowledgeable. One widely accepted instructional framework aligned with this concept is the gradual release of responsibility model (Fisher & Frey, 2008; Pearson et al., 2019). In this model, the teacher identifies a skill within a student’s zone of proximal development, begins by explicitly modeling the target skill, then guides the student through scaffolded practice with immediate feedback, and gradually shifts responsibility toward independent application. This process gives students abundant opportunities to apply the target skill across contexts, genres, and tasks, with varying levels of support that gradually fade over time (Graham, 2018; Graham & Perin, 2007).
Another well-supported approach is strategy instruction, which can be integrated into the gradual release model. Strategy instruction focuses on helping students become aware of and take control over their own learning processes to accomplish their goals. In print literacy instruction, teachers explicitly teach and provide tools for planning, monitoring, and revising, and for fostering metacognitive awareness of reading and writing tasks (Harris, 2024). For example, graphic organizers are used to turn invisible thinking into something students can see, move around, and build on. Genre instruction further complements strategy instruction by helping students understand communication goals through the linguistic, social, and structural conventions of different text types, including personal narratives, informational reports, and persuasive arguments (Rose, 2018). Moreover, mentor texts can be used to highlight genre features, offering students a mental inventory of linguistic and structural possibilities (Culham, 2023). In parallel, comprehension instruction supports students in developing higher-order reading strategies such as summarizing, questioning, and making inferences about the texts they are exposed to (Pearson & Duke, 2002), which, in turn, helps them anticipate how readers might interpret their own writing. Throughout strategy instruction, students are encouraged to reflect on the author’s communicative goals, monitor the clarity of messages, and revise based on how well they think the message will resonate with the audience. In sum, students are prompted to think about their thinking as they read and compose texts, set goals for their own communication, and make adjustments to improve clarity and impact.
Composing as a core practice in literacy instruction
Composing is the backbone of effective literacy instruction. It lets students externalize their strategies, thinking, and language choices. By guiding learners through active composition, teachers cultivate the very strategies, knowledge, and skills that skilled readers and writers rely on. The final product of a composition can take many forms, including written texts, digital presentations, visual artwork, theater performances, or signed language videos, making literacy meaningful and applicable to students’ lives and communication goals.
Seminal work by Hayes and Flower in the 1980s illustrates that composing is not a one-step task. Rather, it involves recursive processes of generating ideas, organizing thoughts, translating them into a communicative form, and revising with a specific audience and purpose in mind (Flower & Hayes, 1980, 1981; Hayes & Flower, 1980, 1986). Composing requires students to attend to two interacting domains: the task environment and working memory (Flower & Hayes, 1981). The task environment includes the topic, audience, genre, prompt, and social context, all of which shape the composer’s goals. Working memory, in contrast, holds the ideas, background knowledge, genre structures, and language resources students draw on in real time while composing. For example, a student composing an informational text must balance their understanding of the intended audience (e.g., toddlers vs. adults) and that audience’s likely knowledge of the topic (task environment) with how to structure facts and supporting details to be adequately informative and interesting (working memory). This means literacy instruction must explicitly teach strategies that help students navigate the task environment and actively engage their working memory.
Instructional methods such as the gradual release model and strategy instruction come in handy here. Through these approaches, students are supported at each step of the composing process. Graphic organizers anchored in genre expectations have been shown to be effective for college students composing persuasive texts (Limpo & Alves, 2018) and also beneficial for younger students who are still developing strategies to manage the cognitive demands of age-appropriate composing tasks such as personal narratives (Boon et al., 2018; Graham & Perin, 2007). There is widespread agreement that early composing experiences play a key role in children’s long-term literacy development. More specifically, it is the combination of rich idea generation, expressive language use, and scaffolded writing activities, facilitated by structured literacy instruction in early childhood and elementary years, that supports the development of skilled readers and writers later on (Bingham et al., 2018; Quinn & Bingham, 2019; Quinn et al., 2021).
One instructional model that applies these evidence-based practices to support writing development among deaf students is Strategic and Interactive Writing Instruction (SIWI). Although SIWI targets written compositions, its framework offers a foundation for considering how similar strategies might support signed compositions. The next section outlines the SIWI approach and how this current study builds on previous research in deaf education.
The present study
Background: Strategic and Interactive Writing Instruction (SIWI)
SIWI adapts evidence-based literacy practices to teach deaf students in upper elementary and middle school. Developed in the 2000s, SIWI has consistently improved deaf students’ writing skills in numerous studies, including a recent large-scale randomized controlled trial (Wolbers et al., 2023). SIWI emphasizes strategy instruction throughout the writing process: planning, organizing, revising, and sharing texts with authentic audiences. Teachers use the gradual release model, graphic organizers, and mentor texts; as students gain proficiency, teacher support fades, building confidence and independence. SIWI is effective because it offers structured, meaningful practice that fits deaf students’ language contexts within a collaborative learning environment. Yet two limitations remain. First, SIWI treats signed language mainly as a bridge to written language and does not directly build signed literacy as a target product. Second, implementation generally begins in Grade 3, when students can produce longer written texts; this leaves younger children with limited opportunities to develop signed literacy skills during their earliest years of schooling. Consequently, there is a clear need for an approach that targets signed composition in the primary grades.
Gap addressed: Strategic and Interactive Signing Instruction (SISI)
To address these limitations, SISI adapts the same evidence-based genre, strategy, and gradual release models to signed texts and can be introduced as early as preschool (Holcomb, 2024). SISI gives students daily practice with multiple genres, such as personal narratives, informational reports, and opinion pieces, by guiding them through brainstorming, organizing, expanding, revising, and sharing subprocesses. Teachers employ graphic organizers, mentor texts, and interactive feedback entirely in ASL to support students’ signed compositions: video-based, “published” signing that can be replayed and revised (as opposed to spontaneous “in-the-air” conversation). In doing so, SISI treats signed composition as a full-fledged literacy outcome rather than just a bridge to English writing (Czubek, 2006; Holcomb & Eberwein, 2025).
Preliminary evidence: SISI
Holcomb (2024) conducted a mixed-methods pilot study in one K-2 classroom with four deaf students to test the feasibility and effectiveness of SISI. The veteran teacher had several years of SIWI professional development experience and received additional coaching focused on guiding students through signed composition. Pre- and post-measures collected across the school year showed clear gains in students’ narrative, informational, and persuasive signed texts. The teacher reported that SISI was practical and beneficial for daily instruction. She did, however, struggle with video-editing software, a reminder that technology support is essential when teaching published signed texts. Before scaling SISI, researchers needed to know whether teachers without prior SIWI or SISI experience could achieve comparable gains, which provides the rationale for this study. Because meaningful instructional change hinges on high-quality professional development (Darling-Hammond & Richardson, 2009), the next section summarizes the design features of effective professional development and how they shaped the current study’s training model.
Rationale for the present study: Effective professional development model
Instructional change requires sustained, high-quality professional development that deepens both content and pedagogical knowledge. Effective professional development models share several core features (Darling-Hammond & Richardson, 2009). First, effective professional development deepens teachers’ understanding of both what they teach (content) and how they teach it (pedagogy); without strengthening both, instructional changes may remain superficial and difficult to sustain. Second, impactful professional development provides intensive learning experiences extended over a meaningful period rather than condensed into one-time events. Third, it emphasizes active, hands-on learning, engaging teachers with instructional strategies through modeling, practice, and feedback rather than passive observation. Fourth, it is collaborative, bringing teachers together with instructional experts and peers to discuss, refine, and apply new strategies within their specific contexts. Fifth, professional development is most powerful when it is data-driven, actively using classroom and student data to guide instructional decisions. Sixth, professional development is most likely to result in sustained instructional change when it aligns with ongoing school reform efforts and broader institutional goals, so that new practices are integrated into the overall mission and culture of the school rather than standing apart from it.
Research questions
This study builds on prior research by aligning SISI (Holcomb, 2024) with principles of effective professional development (Darling-Hammond & Richardson, 2009). SISI draws from well-established broader literacy practices and applies them to signed literacy instruction, with a focus on supporting students in producing signed compositions. The professional development was designed to model, scaffold, and support teachers’ uptake of signed literacy instruction in real classroom contexts. The central research question guiding this study was: What is the effect of professional development on teachers’ fidelity of implementation of SISI when supporting deaf students’ signed composition skills?
Method
Research design
A nonconcurrent multiple-baseline single-case research design (SCRD) was used to evaluate the effect of professional development on teachers’ implementation fidelity of SISI during literacy instruction. The independent variable was the professional development, and the dependent variable was teacher implementation of SISI, as measured by the SISI Fidelity Tool. This SCRD made it possible to identify causal relations between the independent and dependent variables by staggering the introduction of the professional development across teacher participants.
Teacher participants and setting
Participants included four teachers from an ASL/English bilingual deaf education program serving approximately 150 students. Each teacher worked in a different early elementary classroom (kindergarten through third grade). All four participants were deaf and fluent ASL users. Three were identified as White women and one as a Black man. Three of the four were identified as native-like ASL users. In terms of educational background, all teachers held at least a bachelor’s degree. Three held master’s degrees in deaf education along with standard teaching credentials. One teacher held a bachelor’s degree in a noneducation field and was employed under an emergency or provisional credential. Teaching experience ranged from 2 to over 20 years, with one teacher in the early-career range (2–4 years), two in the mid-career range (5–10 years), and one with more than 20 years of experience. Class sizes ranged from four to five students. Each classroom was equipped with tables, smartboards, whiteboards, and student iPads. See Table 1 for teacher demographics.
Table 1.
Teacher participant demographics
| Grade level | Race | Gender | Hearing status | ASL proficiency | Degree | Years of teaching experience | Number of students |
|---|---|---|---|---|---|---|---|
| Kindergarten | White | Woman | Deaf | Native-like | Bachelor’s (non-education field) | 2–4 years | 5 |
| 1st Grade | White | Woman | Deaf | Advanced ASL user | Master’s (Deaf Education) | 11–20 years | 4 |
| 2nd Grade | Black | Man | Deaf | Native-like | Master’s (Deaf Education) | 5–10 years | 4 |
| 3rd Grade | White | Woman | Deaf | Native-like | Master’s (Deaf Education) | 5–10 years | 5 |
Procedures
Baseline
At the start of the academic year, teachers maintained their typical language arts routines. Daily lessons were video-recorded with iPads and automatically uploaded to Swivl, a secure educational cloud system, for fidelity observation and evaluation. Baseline data were collected before each teacher received professional development, following a nonconcurrent multiple-baseline design to demonstrate experimental control among teachers. Each teacher entered the intervention phase on a staggered schedule across several months, receiving professional development and implementing SISI, while those still in baseline continued their usual instruction. Teacher 1 received professional development on October 6, 2024; Teacher 2 on November 9, 2024; Teacher 3 on December 14, 2024; and Teacher 4 on March 1, 2025. This design enabled direct comparisons between baseline and intervention phases among teachers.
During baseline, SISI fidelity data were collected for at least seven randomly selected instructional units per teacher. An instructional unit was defined as a complete cycle of signed composition from idea generation to publishing ASL texts. When no instructional unit could be identified because students did not engage in signed composition, we looked for a unit focused on written composition. If none was found, a fallback procedure was applied: a single language arts class, regardless of its focus, was treated as the instructional unit for fidelity scoring. For example, if a language arts class focused on English grammar drills, read-alouds, or vocabulary study, this would be treated as the instructional unit for fidelity scoring and would then be converted into a data point in the visual graph.
Each selected instructional unit was scored with the SISI Fidelity Tool, which contains 9 categories and 45 indicators: Accessible Learning Environment (2 indicators), Building Knowledge of the Content (4 indicators), Planning and Purpose (3 indicators), Modeling and Mentor Texts (6 indicators), Guided/Collaborative Text Construction (10 indicators), Explicit Instruction of Target Skills (5 indicators), Visual/Concrete Scaffolds (2 indicators), Metacognition and Collaboration (6 indicators), and Language Bridging (7 indicators). Sample indicators include:
- Instructor discusses the communicative purpose of the signed text (inform, entertain, persuade). (Planning and Purpose)
- Instructor explicitly discusses the purpose, audience, and audience expectations when introducing tasks and during composing. (Planning and Purpose)
- Instructor posts or displays visual scaffolds (e.g., charts, diagrams, cue cards, posters) in the classroom to support the composing process. (Visual/Concrete Scaffolds)
- Instructor deconstructs the structure of mentor texts with children (e.g., introduction, body, conclusion). (Modeling and Mentor Texts)
- Instructor guides students in reviewing, evaluating, and revising the signed composition. (Guided/Collaborative Text Construction)
- Instructor provides explicit instruction on the target skill by modeling and giving step-by-step explanations. (Explicit Instruction of Target Skills)
- Instructor makes explicit comparisons between signed and written features. (Language Bridging)
Each indicator was scored as 1 (fully implemented), .5 (partially implemented), or 0 (not implemented). Full credit for Accessible Learning Environment was awarded if the teacher maintained an accessible signing environment, regardless of whether composing occurred. For all other categories, scores of .5 or 1 required engagement in the composing process. When an indicator was implemented to support written compositions, the teacher received partial credit (.5); full credit (1) was awarded only when the indicator was implemented to support signed compositions. Indicator scores were totaled and converted to percentage data points for visual graphing and analysis.
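As a concrete illustration, the scoring-to-percentage conversion described above can be sketched in a few lines of code; the function name and the example scores below are hypothetical, not part of the study materials.

```python
def fidelity_percentage(indicator_scores, total_indicators=45):
    """Convert per-indicator scores (each 0, .5, or 1) into a unit-level
    fidelity percentage, following the scoring rule described in the text."""
    if len(indicator_scores) != total_indicators:
        raise ValueError("expected one score per indicator")
    if any(s not in (0, 0.5, 1) for s in indicator_scores):
        raise ValueError("each score must be 0, .5, or 1")
    return sum(indicator_scores) / total_indicators * 100

# Hypothetical unit: 20 indicators fully, 10 partially, 15 not implemented.
scores = [1] * 20 + [0.5] * 10 + [0] * 15
print(round(fidelity_percentage(scores), 1))  # → 55.6
```

Each unit's percentage becomes a single data point on the visual graph.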
Professional development
Each teacher participated in a comprehensive 2-day professional development session facilitated by the lead author and conducted in the teacher’s own classroom. The training focused on the SISI manual, which delineated the key fidelity categories and their associated instructional indicators. Fidelity expectations required teachers to consistently integrate all 45 instructional indicators within each instructional unit, from initial brainstorming to the final product-sharing stage. During the training, teachers observed explicit modeling of each indicator, engaged in discussion and reflection, and participated in hands-on practice opportunities. Teachers collaborated with the lead author to identify, for each student, target signing skills within that student’s zone of proximal development. They then rehearsed teaching strategies that engaged students in producing signed compositions, delivering strategy and genre instruction via the gradual release model across each subprocess: brainstorming, organizing, producing, revising, and publishing.
Intervention (SISI Implementation)
Immediately following the professional development, each teacher began implementing SISI with their students. Daily lessons were video-recorded and automatically uploaded to Swivl, allowing the lead author to monitor instruction and evaluate fidelity. These observations informed bi-weekly coaching sessions, during which the lead author met with each teacher online to review progress, provide instructional support, and plan upcoming lessons.
Because teachers were required by their administration to follow the American Reading Company (ARC) pacing guides in both phases, SISI was woven into existing ARC topics and standards. For example, when ARC focused on fictional narratives about animals, teachers guided students to create signed compositions on the same theme. Similarly, during ARC units on bugs, students composed informational reports on bugs in ASL. The topic and length of each instructional unit typically varied based on ARC pacing requirements.
Measures
Fidelity tool and interrater reliability
Teacher implementation of SISI was scored with the 45-item fidelity tool covering nine categories (see Appendix A). Each item is rated 0 (not implemented), .5 (partially implemented), or 1 (fully implemented), yielding a unit-level total score (0–45). For interrater reliability, an external rater trained in SISI double-coded a stratified random sample of instructional units. During the entirety of baseline, three of the four teachers did not have students produce any signed or written composition; therefore, “any” language arts lesson was treated as an instructional unit per the study’s fallback rule. After removing the first week of instruction to minimize novelty bias, the lead author used a random-day generator to select seven baseline units for each teacher. In intervention, Teacher 1 taught 6 instructional units involving signed compositions, Teacher 2 taught 4, Teacher 3 taught 3, and Teacher 4 taught 4. Two units per teacher were randomly selected, representing 33%–67% of each teacher’s intervention units. In total, there were 36 double-coded units across the study (28 in baseline and 8 in intervention). Both raters scored every item independently. When disagreements occurred, the raters met to reach a consensus for the analytic data set; however, reliability statistics were calculated on the original, unreconciled scores to avoid inflating agreement.
Social validity interview
At the end of the school year, the lead author conducted an individual, semistructured interview with each teacher to evaluate the social validity of the signed literacy instruction framework and its associated professional development. Interviews were held online in ASL, lasted 30 min, and were video-recorded for later review. The protocol included six open-ended prompts covering five areas: (a) perceived value of SISI for students and for the teacher, (b) feasibility of day-to-day implementation, (c) satisfaction with coaching supports, (d) suggestions for improving the framework, and (e) intentions for future use. Recordings were reviewed again by the lead author, and key points were summarized and reported in the Results section.
Data analysis plan
Visual analysis was the primary method for evaluating changes in teacher implementation fidelity across baseline and intervention phases. Consistent with single-case research standards (Kratochwill et al., 2010), visual graphs were inspected for changes in level, trend, and variability within and across phases, the immediacy of effect following intervention onset, and the maintenance of high-fidelity scores throughout the intervention phase.
Fidelity was plotted as a percentage on the y-axis against school day on the x-axis. Data-point spacing differed by phase. Baseline points represented single language arts lessons sampled on nonsequential days; thus, horizontal spacing conveyed no temporal meaning. Intervention points represented entire composition units that typically spanned multiple weeks; therefore, the horizontal distance between points approximated unit duration. This distinction was noted on each graph and taken into account when interpreting immediacy and trend.
A functional relation was inferred when (a) each teacher showed an immediate change in level or trend upon entering intervention, (b) baseline levels were low and stable, and (c) three replications of the effect were evident across the four staggered baselines. To index magnitude of change, log-response-ratio (LRR) effect sizes (Pustejovsky, 2018) were calculated for each teacher using the Single-Case Effect Size Calculator (Pustejovsky et al., 2024). The LRR is appropriate for percentage outcomes; values were interpreted using recommended benchmarks (small = .5, medium = 1.0, large = 1.5+).
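Conceptually, the LRR compares the intervention-phase mean to the baseline-phase mean on a log scale. The sketch below, using hypothetical fidelity percentages rather than study data, shows the basic quantity; the published estimator (Pustejovsky, 2018) adds small-sample bias corrections and standard errors that are omitted here.

```python
import math

def log_response_ratio(baseline, intervention):
    """Basic log response ratio: the natural log of the ratio of
    intervention-phase mean to baseline-phase mean.

    Illustrative only; the estimator used in the study includes
    bias corrections not reproduced in this sketch.
    """
    mean_a = sum(baseline) / len(baseline)       # baseline phase mean
    mean_b = sum(intervention) / len(intervention)  # intervention phase mean
    return math.log(mean_b / mean_a)

# Hypothetical fidelity percentages for one teacher (not study data):
baseline = [2.2, 2.2, 2.2, 4.4, 2.2, 2.2, 2.2]
intervention = [55.0, 82.0]
print(round(log_response_ratio(baseline, intervention), 2))  # → 3.3
```

Because the ratio is logged, a value near 3 corresponds to roughly a twentyfold increase over baseline, which is why near-zero baselines with moderate intervention levels produce large LRRs.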
Results
This section presents three sets of findings. First, we report teachers’ fidelity of implementation of SISI, analyzed with a nonconcurrent multiple-baseline single-case design that compares baseline and post-training phases. Fidelity scores, expressed as percentage accuracy on the SISI Fidelity Tool, are graphed and examined for changes in level, trend, variability, and immediacy of effect. Second, we summarize interrater reliability statistics for the fidelity instrument. Third, we describe teachers’ perceptions of the professional development model and classroom feasibility, drawing on social validity interviews.
Interrater reliability outcomes
Item-level exact agreement across all double-coded units was 90.8% (1,961 agreements out of 2,160 items). See Table 2 for interrater reliability by teacher and phase. Because the fidelity items are ordinal, agreement was also assessed with quadratic-weighted kappa (κ), which credits partial matches on the 0/.5/1 scale. Using a two-way random-effects model on unit-level totals, the pooled κ was .88 (95% confidence interval = .85–.91), indicating good-to-excellent interrater consistency by conventional benchmarks (κ ≥ .80). Teacher-specific κ values were .86 (Teacher 1), .99 (Teacher 2), .90 (Teacher 3), and .80 (Teacher 4), showing consistently strong reliability across raters and phases.
Table 2.
Interrater reliability by teacher and phase
| Teacher | Baseline units | % Agreement | Intervention units | % Agreement |
|---|---|---|---|---|
| 1 | 7 | 87.5% | 2 | 87.5% |
| 2 | 7 | 99.7% | 2 | 100% |
| 3 | 7 | 82% | 2 | 97.9% |
| 4 | 7 | 96.7% | 2 | 81.25% |
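For readers unfamiliar with weighted kappa, the sketch below shows how quadratic weighting credits near-misses on the 0/.5/1 scale: a 0-versus-.5 disagreement is penalized less than a 0-versus-1 disagreement. The ratings are hypothetical, and this simple Cohen-style computation is not the two-way random-effects model reported above.

```python
from collections import Counter

def quadratic_weighted_kappa(rater1, rater2, categories=(0.0, 0.5, 1.0)):
    """Quadratic-weighted kappa for ordinal ratings on a 0/.5/1 scale.

    Disagreements are weighted by the squared distance between category
    indices, so adjacent-category mismatches cost less than extreme ones.
    """
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    # Observed weighted disagreement across all double-coded items.
    observed = sum((idx[a] - idx[b]) ** 2 for a, b in zip(rater1, rater2))
    # Expected weighted disagreement under chance, from the raters' marginals.
    m1, m2 = Counter(rater1), Counter(rater2)
    expected = sum(
        m1[a] * m2[b] / n * (idx[a] - idx[b]) ** 2
        for a in categories for b in categories
    )
    return 1.0 if expected == 0 else 1.0 - observed / expected

# Hypothetical ratings of eight items by two raters (not study data):
r1 = [0, 0, 0.5, 1, 1, 1, 0.5, 0]
r2 = [0, 0.5, 0.5, 1, 1, 0.5, 0.5, 0]
print(quadratic_weighted_kappa(r1, r2))  # → 0.8
```

In this toy example the two disagreements are both single half-steps, so the weighted kappa stays high (.80) despite imperfect exact agreement.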
Visual analysis explanation
Teacher fidelity to SISI was analyzed across baseline and intervention phases using a nonconcurrent multiple-baseline single-case design. Fidelity scores, calculated as percentages using the SISI Fidelity Tool, were plotted over time and visually analyzed for changes in level, trend, variability, and immediacy of effect. See Fig. 1.
Figure 1.

Vertical dashed lines mark phase changes; baseline points plotted at nominal equal spacing, intervention points by unit start date.
Interpretation of time in the visual graph
In this study, time is represented differently across baseline and intervention phases due to the nature of instruction and the structure of data collection. During baseline, teachers did not engage students in the composing process. Instead, instruction focused on isolated grammar skills or other literacy-related activities that were not part of a coherent composition unit. As such, each data point in the baseline phase represents a single language arts class. The spacing between data points in this phase does not carry temporal meaning and does not reflect sequential instructional progression. For example, if one data point appears on Day 10 and the next on Day 20, this does not indicate a 10-day cohesive instructional unit; each point corresponds to a standalone lesson.
In contrast, during the intervention phase, teachers implemented full composition units aligned with the SISI model introduced through professional development. Each data point reflects the fidelity of implementation across an entire instructional unit, which involves a connected sequence of generating ideas, organizing content, revising, and publishing signed texts. These units embedded specific instructional methods throughout, such as the use of graphic organizers and mentor texts, and usually spanned multiple weeks. Therefore, the time between intervention-phase data points reflects the duration of each instructional unit. For example, if a data point is reported on Day 70 and then another on Day 90, it indicates that there were two instructional units involving two different composition projects: the first unit started on Day 70 and lasted approximately 20 school days, and the second unit began on Day 90. Although school days are used as a common timeline, baseline data represent discrete, nonsequential lessons, whereas intervention data capture sustained instructional efforts across full composition units; this distinction is critical when interpreting the x-axis.
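The x-axis convention described above can be made concrete with a small helper (hypothetical, not part of the study's materials) that assigns plotting positions exactly as the paragraphs describe: baseline lessons at nominal equal spacing, intervention units at their actual school-day start:

```python
def x_positions(baseline_lessons, intervention_units, spacing=5):
    """Assign x-axis positions for a multiple-baseline fidelity graph.

    baseline_lessons: count of standalone baseline lessons; these carry no
        temporal meaning, so they are placed at nominal equal spacing.
    intervention_units: list of (start_day, fidelity) tuples; each unit is
        placed at its actual start day, so gaps reflect unit duration.
    Returns (baseline_x, intervention_x) position lists.
    """
    baseline_x = [i * spacing for i in range(baseline_lessons)]
    intervention_x = [start_day for start_day, _ in intervention_units]
    return baseline_x, intervention_x
```

For instance, three baseline lessons followed by units starting on Days 70 and 90 would be plotted at x = 0, 5, 10 and x = 70, 90, making visually explicit that only the intervention-phase spacing encodes elapsed instructional time.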
Visual analysis outcomes
During baseline conditions, three teachers had near-zero levels of fidelity, with only one indicator consistently completed (in the category of accessible classroom environment). Teacher 3 (Grade 3) occasionally asked students to produce written compositions in response to English literature on single days; these isolated tasks did not form a multi-day composing cycle, and her baseline fidelity was approximately 5%–15% of SISI indicators across observations. Visual analysis indicates that, for all teachers, there was an immediate change in level, with every teacher reaching about 45%–60% fidelity in the first post-training unit. The data for Teachers 1–3 also show an increasing trend, with all three reaching 80%–90% fidelity in their final sessions. Teacher 4 maintained moderately high levels of fidelity during intervention, with no increasing trend. Baseline variability was minimal (all points under 7%), whereas intervention variability remained low once fidelity exceeded 80%, and high scores were maintained across the final two instructional units for every teacher. These three replications of an immediate, sustained level change satisfy the decision rule for a functional relation specified in the data analysis plan. To summarize, these data patterns indicate consistent positive effects across teachers, supporting the identification of a functional relation between professional development and teacher outcomes.
Effect size
We calculated the log response ratio for an increasing outcome (LRRi; Pustejovsky, 2018) to quantify the magnitude of change between baseline and intervention conditions (see Table 3), using the Single-Case Effect Size Calculator (Pustejovsky et al., 2024; see Supplemental Materials for related code). Effect sizes were consistent across teachers (2.04–3.72), with a somewhat smaller effect for Teacher 3 due to her more elevated baseline. These data support the visual analysis in concluding that the intervention was effective, with changes across teachers being both consistent and large in magnitude.
Table 3.
Effect sizes
| Teacher | LRRi effect size | Standard error |
|---|---|---|
| 1 | 3.72 | 0.05 |
| 2 | 3.71 | 0.06 |
| 3 | 2.04 | 0.22 |
| 4 | 3.36 | 0.04 |
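The published LRRi values come from the Single-Case Effect Size Calculator, which also applies a small-sample bias correction and computes standard errors. The core quantity, however, is just the log of the ratio of phase means. The sketch below is illustrative only; the `floor` argument is a hypothetical stand-in for the truncation constant recommended for percentage outcomes with near-zero baselines:

```python
import math

def log_response_ratio_increase(baseline, intervention, floor=None):
    """Basic (bias-uncorrected) log response ratio for an increasing outcome:
    LRRi = ln(mean(intervention) / mean(baseline)).

    floor: optional truncation value applied to phase means, so the ratio
    stays defined when the baseline mean is at or near zero.
    """
    mean_a = sum(baseline) / len(baseline)
    mean_b = sum(intervention) / len(intervention)
    if floor is not None:
        mean_a = max(mean_a, floor)
        mean_b = max(mean_b, floor)
    return math.log(mean_b / mean_a)
```

For example, a baseline mean of 2% fidelity rising to an intervention mean of 75% gives ln(75/2) ≈ 3.62, comparable in magnitude to the values in Table 3. The published estimates should be reproduced with the calculator itself rather than this sketch.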
Social validity outcomes
At year’s end, the lead author met individually with each teacher for a 30-min semistructured interview to gather feedback on the framework’s value, feasibility, and needed improvements. All four teachers wished they had received professional development earlier, noting that an earlier start could have supported greater student growth. Because of the staggered multiple-baseline design, professional development was delivered between September and March; despite this delay, each teacher reported that both they and their students benefited from the subsequent instructional changes. Teachers highlighted concrete shifts: one described collaborative signing (students co-constructing signed texts before translating them to writing) as transformative, while another discussed reorganizing her classroom into centers and small-group rotations to differentiate instruction next year. They observed increased engagement, deeper “literary understanding,” and more strategic moves during composing, especially when students could share signed texts with authentic audiences. Collectively, these anecdotes suggest SISI fosters students’ linguistic agency and ownership. Looking ahead, all teachers expressed eagerness to begin signed literacy instruction at the start of the next school year, felt more confident aligning the framework with existing curriculum demands, and showed flexibility in tailoring instruction to the diverse needs of their deaf students.
Discussion
This study examined whether targeted professional development could translate signed literacy instruction from conceptual frameworks into daily classroom practice. Using a non-concurrent multiple-baseline single-case design with four K-3 teachers, this study tested whether SISI training would increase teachers’ implementation fidelity on 45 instructional indicators. Baseline instruction reflected long-standing “business-as-usual” patterns involving isolated English grammar drills, read-alouds, and independent written responses with no systematic support for composing in either ASL or English. Once SISI was introduced, fidelity climbed rapidly (from 0%–5% to 60%–80%) and remained high, documenting a functional relation between professional development and teacher behavior. Below, these findings are situated within broader conversations about (a) what counts as literacy in deaf education, (b) how and why rapid instructional change occurred, (c) equity and sustainability considerations, and (d) directions for research and practice.
What counts as literacy in deaf education
This study contributes to ongoing calls in the literature to expand definitions of literacy and to rethink how literacy is taught in deaf education (Gibson & Byrne, 2024; Holcomb, 2024). Limiting literacy to print-only forms restricts the expressive capacities of signing deaf students and constrains their access to foundational composing experiences that occur before they begin to write texts. Reframing literacy to include signed composition repositions ASL as not just a bridge to English but a valid and generative site of literacy in its own right. Literacy is a layered, recursive process, and early composing, in particular, supports development in language, literacy, and thinking (Bingham et al., 2018; Flower & Hayes, 1981; Quinn et al., 2021). For signing deaf students developing bilingually or multilingually, this process must extend beyond spoken language and print. Like print literacy, signed literacy develops through instruction that supports students in viewing, analyzing, and composing texts in video form (Holcomb, 2024). This study operationalized that approach by guiding teachers to embed strategy and genre instruction in signed composition as a core classroom practice.
How and why rapid instructional changes occurred
Fluency in ASL, while necessary, proved insufficient for teachers to effectively teach signed literacy. The combination of a focused 2-day workshop that deepened teachers’ content and pedagogical knowledge and bi-weekly virtual coaching produced the observed instructional shift. This model aligns with seminal professional development literature (Darling-Hammond & Richardson, 2009): it was content-specific, active, collaborative, data-driven, and extended over time. Immediate feedback from the lead researcher, who also served as coach, following the 10-hr workshop likely fueled teacher enthusiasm and promoted early implementation. All teacher participants, spanning grades K-3 and ranging from provisionally licensed to veteran, moved from near-zero baseline fidelity to well above the 60%–80% benchmark after SISI training. Gains appeared within one instructional unit and then increased and held steady thereafter, consistent with meta-analyses showing that focused, feedback-rich professional development can yield moderate to large effects on instructional quality (Kraft et al., 2018). Strategic and interactive signing instruction was successfully applied across teachers, grade levels, and pacing guides, which indicates that the gradual release model, strategy instruction, and genre instruction can transfer effectively to signed literacy instruction.
The pattern of rapid fidelity gains appears to hinge on four mutually reinforcing factors. First, teachers who received the workshop early in the year had a longer runway to experiment, receive feedback, and adjust practice; their comments suggest that front-loading the training during preservice week would increase the time available to sharpen instruction and provide more signed composition experiences and opportunities. Second, because the framework was woven into the school’s existing American Reading Company (ARC) pacing guide rather than functioning in isolation, teachers could treat signed composition as a lens for deepening mandated units (creating bug-themed informational videos, for example) without sacrificing required curriculum content. Although SISI was integrated into ARC units in this study, future work must consider how pacing expectations might constrain the time and flexibility needed for deep engagement with strategic approaches to signed composition. Teachers in rigid instructional systems may require scheduling negotiation to adapt pacing guides without compromising SISI quality. Third, the provisionally licensed teacher was able to attain high fidelity. This pattern shows that a clearly scaffolded routine paired with targeted coaching can lift the performance of newer teachers, giving them rapid access to evidence-based practices. Finally, teachers’ enthusiasm to keep using the approach next year demonstrates its social validity; when teachers feel confident in what they are doing and see immediate improvements in student engagement, they are more willing to persist through the inevitable learning curve of new planning demands, which increases the likelihood that high-leverage practices will endure after external support tapers off. Beyond structural supports, changes in teacher identity and belief may also have played a role.
As teachers began to see signed composition as a legitimate and impactful form of literacy instruction, they were more willing to revise routines that had long prioritized English-only outcomes.
Challenges encountered
The most veteran teacher’s SISI fidelity plateaued near 60%. Two factors likely contributed. First, she received professional development late in the year and participated in fewer coaching sessions, limiting practice opportunities. Second, she struggled with the technology required to record and edit signed texts, a difficulty also noted in the pilot study (Holcomb, 2024). Teachers steeped in long-standing English-focused methods and unfamiliar with video tools may need extra time, and differentiated coaching, to unlearn ingrained routines and master the technical skills essential for a bilingual framework like SISI. Professional development that accelerates novice uptake while providing sustained support for veterans can reduce instructional variability across classrooms. Opportunities for teachers to produce their own signed compositions and use video editing for revision should be integrated into teacher preparation and ongoing professional development.
Limitations and future directions
While this study provides strong evidence of a causal relation between professional development and improvements in instructional fidelity for signed literacy instruction, several limitations warrant consideration. First, single-case research designs (SCRDs) are intended to demonstrate functional relations within individuals or small groups, rather than to support broad generalizations across populations and settings. This study included four teachers from a single ASL/English bilingual program. As such, findings are most applicable to similar contexts, namely classrooms with teachers who are fluent in ASL, and may not extend to settings where teachers have varying levels of ASL proficiency. Second, the present study focused exclusively on teacher outcomes. While shifts in instructional practice are a key precursor to student growth, student outcomes were not reported. Future research should incorporate designs that evaluate the impact of high-fidelity SISI implementation on students’ signed and written composition outcomes. Third, the lead author served in a dual role as both the trainer and primary rater of instructional fidelity. Despite efforts to ensure impartiality through independent scoring by a second rater, whose ratings were then compared with the primary rater’s, this dual role may have introduced bias. Because the primary rater could see whether a lesson came from baseline or intervention, fully blinded scoring was not possible. Although quadratic-weighted κ indicated strong agreement between the raters, awareness of condition could still bias scoring. Ideally, future studies will involve raters who are entirely independent of the training team. Further work is also needed to evaluate how scalable and sustainable SISI may be across a range of deaf education contexts.
Conclusion
This study advances the call to broaden literacy instruction in deaf education. By adapting evidence-based practices to the signed modality, SISI offers a structured way for students to compose, revise, and share intentional signed texts, rather than relying on spontaneous “in-the-air” signing or treating ASL only as a bridge to English. The clear fidelity gains following professional development show the framework is both feasible and effective. Without targeted training, signed literacy instruction was largely absent even in fluent-signer classrooms; with it, teachers implemented evidence-based methods at high levels. These results point to a broader challenge in the field: the need to move beyond informal uses of ASL in the classroom and toward systematic instruction that treats ASL as a language to develop academic, literacy, and cognitive skills. Only then can bilingual instruction in deaf education fully reflect the promise of treating both ASL and English as equally important languages for literacy.
Supplementary Material
Funding
Research reported in this publication was supported by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health under Award Number R21DC021024. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Footnotes
Conflicts of interest: None declared.
References
- Andrews R, Torgerson C, Beverton S, Freeman A, Locke T, Low G, & Zhu D (2006). The effect of grammar teaching on writing development. British Educational Research Journal, 32(1), 39–55. 10.1080/01411920500401997
- Bingham GE, Quinn MF, McRoy K, Zhang X, & Gerde HK (2018). Integrating writing into the early childhood curriculum: A frame for intentional and meaningful writing experiences. Early Childhood Education Journal, 46, 601–611. 10.1007/s10643-018-0894-x
- Boon RT, Barbetta PM, & Paal M (2018). The efficacy of graphic organizers on the writing outcomes of students with learning disabilities: A research synthesis of single-case studies. Learning Disabilities: A Multidisciplinary Journal, 23(2), 1–17. 10.18666/ldmj-2018-v23-i2-9042
- Culham R (2023). Writing thief: Using mentor texts to teach the craft of writing. Routledge.
- Czubek TA (2006). Blue Listerine, parochialism, and ASL literacy. Journal of Deaf Studies and Deaf Education, 11(3), 373–381. 10.1093/deafed/enj033
- Darling-Hammond L, & Richardson N (2009). Teacher learning: What matters? Educational Leadership, 66(5), 46–53.
- Duthie C (1994). Nonfiction: A genre study for the primary classroom. Language Arts, 71(8), 588–595. 10.58680/la199425240
- Fearn L, & Farnan N (2007). When is a verb? Using functional grammar to teach writing. Journal of Basic Writing, 26(1), 63–87. https://www.jstor.org/stable/43443838
- Fisher D, & Frey N (2008). Better learning through structured teaching: A framework for the gradual release of responsibility. ASCD.
- Flower L, & Hayes JR (1980). The cognition of discovery: Defining a rhetorical problem. College Composition and Communication, 31(1), 21–32.
- Flower L, & Hayes JR (1981). A cognitive process theory of writing. College Composition and Communication, 32(4), 365–387.
- Gibson H, & Byrne A (2024). Alfabetização em ASL e desenvolvimento do pensamento crítico [ASL literacy and the development of critical thinking]. Revista Brasileira de Alfabetização, 22, 1–20. 10.47249/rba2024933
- Graham S (2018). A revised writer-within-community model of writing. Educational Psychologist, 53(4), 258–279. 10.1080/00461520.2018.1481406
- Graham S, McKeown D, Kiuhara S, & Harris KR (2012). A meta-analysis of writing instruction for students in the elementary grades. Journal of Educational Psychology, 104(4), 879–896. 10.1037/a0029185
- Graham S, & Perin D (2007). A meta-analysis of writing instruction for adolescent students. Journal of Educational Psychology, 99(3), 445–476. 10.1037/0022-0663.99.3.445
- Harris KR (2024). The self-regulated strategy development instructional model: Efficacious theoretical integration, scaling up, challenges, and future research. Educational Psychology Review, 36(4), 104. 10.1007/s10648-024-09921-x
- Harris KR, Kim YS, Yim S, Camping A, & Graham S (2023). Yes, they can: Developing transcription skills and oral language in tandem with SRSD instruction on close reading of science text to write informative essays at grades 1 and 2. Contemporary Educational Psychology, 73, 102150. 10.1016/j.cedpsych.2023.102150
- Hayes JR, & Flower LS (1980). Identifying the organization of writing processes. In Gregg LW & Steinberg ER (Eds.), Cognitive processes in writing (pp. 3–30). Lawrence Erlbaum Associates.
- Hayes JR, & Flower LS (1986). Writing research and the writer. American Psychologist, 41(10), 1106–1113. 10.1037/0003-066X.41.10.1106
- Holcomb L (2024). Exploring signed literacy in elementary deaf students through evidence-based instructional methods. Sign Language Studies, 24(4). 10.1353/sls.2024.a936335
- Jones S, Myhill D, & Bailey T (2013). Grammar for writing? An investigation of the effects of contextualised grammar teaching on students’ writing. Reading and Writing, 26(8), 1241–1263. 10.1007/s11145-012-9416-1
- Kraft MA, Blazar D, & Hogan D (2018). The effect of teacher coaching on instruction and achievement: A meta-analysis of the causal evidence. Review of Educational Research, 88(4), 547–588. 10.3102/0034654318759268
- Kratochwill TR, Hitchcock J, Horner RH, Levin JR, Odom SL, Rindskopf DM, & Shadish WR (2010). Single-case designs technical documentation (version 1.0). What Works Clearinghouse.
- Ladson-Billings G (1992). Reading between the lines and beyond the pages: A culturally relevant approach to literacy teaching. Theory Into Practice, 31(4), 312–320. 10.1080/00405849209543558
- Limpo T, & Alves RA (2018). Effects of planning strategies on writing dynamics and final texts. Acta Psychologica, 188, 97–109. 10.1016/j.actpsy.2018.06.001
- Mirra N, & Garcia A (2021). In search of the meaning and purpose of 21st-century literacy learning: A critical review of research and practice. Reading Research Quarterly, 56(3), 463–496. 10.1002/rrq.313
- Myhill D, Jones S, Watson A, & Lines H (2013). Re-thinking grammar: The impact of embedded grammar teaching on students’ writing and metalinguistic understanding. Research Papers in Education, 27(2), 139–166. 10.1080/02671522.2011.637640
- Myhill D, & Watson A (2014). The role of grammar in the writing curriculum: A review of the literature. Child Language Teaching and Therapy, 30(1), 41–62. 10.1177/0265659013514070
- National Governors Association Center for Best Practices, & Council of Chief State School Officers. (2010). Common Core State Standards for English language arts & literacy in history/social studies, science, and technical subjects. http://www.corestandards.org/ELA-Literacy
- Pearson PD, & Duke NK (2002). Comprehension instruction in the primary grades. In Block CC & Pressley M (Eds.), Comprehension instruction: Research-based best practice (pp. 247–258). Guilford Press.
- Pearson PD, McVee MB, & Shanahan LE (2019). In the beginning: The historical and conceptual genesis of the gradual release of responsibility. In McVee MB, Ortlieb E, Reichenberg JS, & Pearson PD (Eds.), The gradual release of responsibility in literacy research and practice (pp. 1–21). Emerald Publishing. 10.1108/S2048-04582019000010001
- Pressley M, & Allington RL (2014). Reading instruction that works: The case for balanced teaching (3rd ed.). Guilford Press.
- Pustejovsky JE (2018). Using response ratios for meta-analyzing single-case designs with behavioral outcomes. Journal of School Psychology, 68, 99–112. 10.1016/j.jsp.2018.02.002
- Pustejovsky JE, Chen M, Grekov P, & Swan DM (2024). Single-case effect size calculator (version 0.7.3) [web application]. https://jepusto.shinyapps.io/SCD-effect-sizes/
- Quinn MF, & Bingham GE (2019). The nature and measurement of children’s early composing. Reading Research Quarterly, 54(2), 213–235. 10.1002/rrq.232
- Quinn MF, Bingham GE, & Gerde HK (2021). Who writes what when? Examining children’s early composing. Reading and Writing, 34(1), 79–107. 10.1007/s11145-020-10063-z
- Rose D (2018). Languages of schooling: Embedding literacy learning with genre-based pedagogy. European Journal of Applied Linguistics, 6(1), 59–89. 10.1515/eujal-2017-0008
- Shanahan T, Callison K, Carriere C, Duke NK, Pearson PD, Schatschneider C, & Torgesen J (2010). Improving reading comprehension in kindergarten through 3rd grade: IES practice guide (NCEE 2010–4038). U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse.
- Vygotsky LS (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
- Wolbers K, Dostal H, & Holcomb L (2023). Teacher reports of secondary writing instruction with deaf students. Journal of Literacy Research, 55(1), 28–50. 10.1177/1086296X231163124
- Woodard R, Vaughan A, & Machado E (2017). Exploring culturally sustaining writing pedagogy in urban classrooms. Literacy Research: Theory, Method, and Practice, 66(1), 215–231. 10.1177/2381336917718809