Compass

Undergraduate research experiences provide numerous personal, professional, and societal benefits, including enhancing student learning and broadening student participation and retention across diverse fields of study. However, scaling effective environments for this kind of research training is difficult when mentoring resources are limited. To overcome the practical limitations of scaling such training environments, we introduce **Agile Research Studios (ARS)**, a learning ecosystem of socio-technical supports designed to scaffold students to plan, seek help, and reflect as they learn to self-direct complex work within the community (Figures 1 and 2).

As I studied how students practiced in the ecosystem, I observed that they still faced critical gaps in their planning, help-seeking, and reflection processes. For instance, in studying the expert planning process, I recognized that expert design-researchers visualize the argumentation structure of their design problem, diagnose risks in that structure, and then focus their iteration plan on tackling the most critical risks first. Our previous scaffolding focused on the Sprint Log tool, which helps students break their plans down into detailed tasks, and SIG planning meetings, where they could get mentor feedback on their planning strategies for the upcoming week. However, we lacked explicit scaffolding for two key components of the expert planning process -- visualizing the design problem and diagnosing risks. To address these gaps in our planning scaffolds, we augmented the planning subsystem with Polaris, a representation and risk-assessment tool that scaffolds students in constructing and evaluating the argumentation behind their designs (Figures 3 and 4).

However, this approach of augmenting the subsystems with new tools and processes in response to each emerging gap can have unintended consequences that disrupt the overall learning ecosystem. Namely, the introduction of more tools and processes means that the subsystems and the overall ecosystem grow in complexity. For instance, despite the introduction of Polaris to help students represent and assess risks in their design argumentation, we learned that students conducting design-research needed additional representations for other aspects of their project rationale (e.g., interface and systems arguments for how a design should be instantiated and how it will functionally work, or an approach tree to help students argue for the novelty of their approach compared to existing approaches). In response, we expanded Polaris into a set of linked canvas tools: templates with additional risk-assessment prompts that help students represent and assess the different argumentation layers of their approach. To provide students with direct coaching and feedback on their argumentation, we introduced Mysore, a feedback and practice venue where students can workshop a risky slice of their argumentation alongside a research mentor. While well-intended, a complex learning ecosystem rich with supports can actually inhibit a novice trying to navigate that ecosystem as they learn. Despite these augmentations, I observed that students still struggled to monitor and improve the ways in which they practice metacognitive strategies across the community supports available to them.

While the ARS model introduces this rich ecosystem of supports designed to scaffold students in these metacognitive strategies, the approach fails to explicitly train students to think about how to execute these strategies across the ecosystem of supports available to them, as experts coach them to do. In my work, I have observed process management breakdowns (Figure 5). As an example, students planning out their work may set deliverables that address risks they identified at the beginning of their sprint. As they continue working, the Polaris risk-assessment tool, a Mysore argumentation feedback session, or a SIG planning meeting may surface a new risk. In such cases, we've observed that students often continue with their previous plan until their mentor raises an issue, rather than adapting their plan to the newly surfaced risk. This is an example of a planning process breakdown, where students fail to link the planning feedback they've received via ARS supports to the ways in which they revise their planning process. As a consequence, students miss out on opportunities to implement the new strategies that the ecosystem of ARS supports helped them surface. For the ecosystem, this can mean poor utilization of the existing supports, limiting its potential scalability. For the students, this can hinder their metacognitive practice and overall development as self-directed design-researchers.

To overcome barriers to fully leveraging the ecosystem of available supports, I introduce the Compass: a process management framework that guides students in how to monitor and revise their metacognitive practice as they move across available supports in the ecosystem (Figures 6 and 7). The framework uses a combination of on-action dashboards and in-action cues. On-action dashboards help students zoom out to plan, monitor, diagnose, and improve the strategies they practice across supports in the ecosystem (see Figure 6). For example, the planning dashboard helps students assess whether the risks they identified in the Polaris tool are aligned with the deliverables they detailed in their Sprint Log tool. In-action cues help students identify opportunities to enact desired practices as they work with the supports in the ecosystem (see Figure 7). For example, in-action cues can prompt a student to incorporate planning feedback they received in a SIG meeting (e.g., reprioritizing their planned next steps) into their Sprint Log tool, or remind students mid-week to work in alignment with their risks and deliverables. This process management framework embeds expert process strategies that model how mentors coach students to monitor and revise the ways in which they execute their practice within the ecosystem (e.g., checking that they are working toward a deliverable that mitigates the riskiest parts of their design and research work). Further, the framework integrates into the existing ecosystem supports that students already use to practice (e.g., a dashboard view that pulls summative planning process data from the Sprint Log tool, or Slack cues sent in their project channel shortly after their SIG meeting). By integrating these expert process strategies into existing ecosystem supports, these process management scaffolds enable students to recognize and resolve gaps in their practice as they are practicing in the community. Integrating a process management framework into existing supports helps students leverage existing opportunities to implement the metacognitive strategies that the ARS ecosystem is designed to surface. Such scaffolding better equips students to learn how to fully harness the power of socio-technical learning ecosystems like ARS.
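To make the dashboard's alignment check concrete, here is a minimal sketch of how risks surfaced via Polaris, Mysore, or SIG meetings might be compared against Sprint Log deliverables. The data model and function names (`Risk`, `Deliverable`, `unaddressed_risks`) are hypothetical illustrations of the idea, not the actual Compass implementation.

```python
# Hypothetical sketch: checking risk-deliverable alignment for a weekly
# planning dashboard. Class and field names are illustrative, not the
# actual Compass or Sprint Log data model.
from dataclasses import dataclass, field


@dataclass
class Risk:
    """A risk surfaced via Polaris, Mysore, or a SIG meeting."""
    id: str
    description: str
    severity: int  # e.g., 1 (low) to 3 (critical)


@dataclass
class Deliverable:
    """A sprint deliverable, tagged with the risks it is meant to mitigate."""
    description: str
    addresses: list[str] = field(default_factory=list)  # risk ids


def unaddressed_risks(risks: list[Risk], deliverables: list[Deliverable]) -> list[Risk]:
    """Return risks that no current deliverable targets, most severe first."""
    covered = {rid for d in deliverables for rid in d.addresses}
    gaps = [r for r in risks if r.id not in covered]
    return sorted(gaps, key=lambda r: r.severity, reverse=True)


# Example: a newly surfaced risk with no matching deliverable would appear
# at the top of the dashboard's "revisit your plan" list.
risks = [Risk("r1", "Users may not notice the prompt", 3),
         Risk("r2", "Onboarding flow is unclear", 2)]
deliverables = [Deliverable("Prototype prompt placement study", addresses=["r1"])]
for gap in unaddressed_risks(risks, deliverables):
    print(f"No deliverable addresses: {gap.description} (severity {gap.severity})")
```

In this sketch, the dashboard simply surfaces the mismatch; the student still decides whether to revise their plan, keeping the metacognitive work with the learner rather than automating it away.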

Figures 1 and 2. A Socio-technical Ecosystem for Research Training

(1) To scale effective research training given limited mentoring resources, Agile Research Studios (ARS) fosters a _self-directed learning environment_ that takes a dispersed-control approach, distributing learning across an ecosystem of existing supports in the community.

(2) The Agile Research Studios (ARS) model introduces an _ecosystem of socio-technical supports_: virtual tools, agile processes, social structures, training resources, and feedback and practice venues. These components weave together as subsystems designed to scaffold students to plan, seek help, and reflect as they learn to self-direct complex work within the community. Here, we emphasize the components that make up the planning subsystem.

Figures 3 and 4. Augmenting an Ecosystem to Address Critical Subsystem Gaps

(3) While the existing ARS subsystem scaffolded students to construct iteration plans via the Sprint Log tool and SIG planning meetings, there were still critical gaps -- namely, scaffolding students to visualize design problem structures and diagnose them for risks as experts do.

(4) Polaris is a learner-centered diagnostic tool that embeds expert knowledge and practice to scaffold novices to construct and evaluate their design arguments. Novices use the design argument template (left) and the reflection prompts (right) to visually and procedurally evaluate their design arguments for structural issues.

Figure 5. Process Scaffolds to Execute Effective Practice within an Ecosystem

As we augment existing subsystems with new tools and processes, the overall learning ecosystem grows in complexity. We began to observe process breakdowns as students attempted to leverage the ecosystem toward their learning. Here, we see planning process breakdowns, where students still struggle to connect their understanding of the problem to the critical risks they face, and then to connect those risks to their iteration plan.

Figures 6 and 7. The Compass: A Planning Process Framework. The Compass is a planning process management framework implemented as (6) a weekly planning dashboard and (7) planning cues via Slack.

(6) The dashboard provides students with a view that helps them assess whether their weekly deliverables will address their project risks. (7) The planning cues serve as in-action check-ins that remind students to execute their planning processes at opportune moments (e.g., having students sprint plan before their SIG meeting, asking them mid-week if they are still on track for their deliverables, or checking in after a Mysore or SIG session to see if they have updated their risks or plans based on mentor feedback).
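As a rough illustration of the cue timing described above, the sketch below shows how such check-ins might be selected based on where a student is in their weekly rhythm. The timing rules, thresholds, and message text are assumptions made for illustration; the actual Slack integration may work differently.

```python
# Hypothetical sketch of scheduling in-action planning cues around a
# student's weekly rhythm. Rules and messages are illustrative only.
from datetime import datetime, timedelta


def planning_cues(sig_meeting: datetime, sprint_end: datetime, now: datetime) -> list[str]:
    """Return any cue messages that should be posted to the project channel now."""
    cues = []
    # Before the SIG meeting: prompt students to draft their sprint plan.
    if timedelta(0) < sig_meeting - now <= timedelta(hours=24):
        cues.append("Your SIG meeting is tomorrow -- have you drafted your sprint plan?")
    # Shortly after the SIG meeting: prompt students to fold feedback back into the plan.
    if timedelta(0) < now - sig_meeting <= timedelta(hours=2):
        cues.append("Did any new risks come up in SIG? Update your Sprint Log if priorities changed.")
    # Mid-sprint: check whether work is still on track toward deliverables and risks.
    midpoint = sig_meeting + (sprint_end - sig_meeting) / 2
    if abs((now - midpoint).total_seconds()) < 3600:
        cues.append("Mid-week check-in: are you still on track for your deliverables and risks?")
    return cues
```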

Team

Faculty

  • Haoqi Zhang

Ph.D. Students

  • 🎓 Leesha Maliakal Shah

Master's and Undergraduate Students

  • None