tl-its-umich-edu / my-learning-analytics

My Learning Analytics (MyLA)

[NEW] Discussion View #564

Open justin0022 opened 5 years ago

justin0022 commented 5 years ago

Todo

Background

Sanam recently joined the UBC Learning Analytics team, and she’ll be helping us design and create some additional dashboards for MyLA. For the past two years, Sanam has been working in the Faculty of Arts at UBC as an LA practitioner. In her role, she works with faculty and programs to help answer teaching- and learning-related data questions.

Prior to that, Sanam had conducted research on student-facing LA dashboards as part of her graduate studies at Simon Fraser University. Her research focused on how the effects of information presented through dashboards vary with students’ individual differences. She was part of the team that designed and implemented learner dashboards for Canvas discussions; the dashboards were evaluated in the context of several courses at SFU.

Overview

The visualization shows a list of recommended keywords to address in the discussion topic. The keywords are selected by the instructor on the basis of their relevance and importance to the topic and the goals of the discussion activity [1, 2].
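To make the keyword-matching piece concrete, here is a rough Python sketch (purely illustrative and not from the MyLA codebase; the function, keywords, and sample posts are all invented) of how mentions of the instructor-selected keywords could be tallied for a student’s own posts and for the rest of the class:

```python
# Hypothetical sketch: count how often each instructor-selected keyword appears
# in a set of discussion posts. Names and sample data are illustrative only.
import re
from collections import Counter

def keyword_counts(posts: list[str], keywords: list[str]) -> Counter:
    """Count whole-word, case-insensitive keyword mentions across a set of posts."""
    counts = Counter()
    for post in posts:
        for kw in keywords:
            counts[kw] += len(re.findall(rf"\b{re.escape(kw)}\b", post, flags=re.IGNORECASE))
    return counts

keywords = ["self-regulation", "feedback", "motivation"]
my_posts = ["Feedback supports self-regulation because it gives students a reference point."]
class_posts = ["Motivation and feedback interact when students set their own goals."]

print("me:   ", keyword_counts(my_posts, keywords))
print("class:", keyword_counts(class_posts, keywords))
```

In practice we would probably want to handle stemming and synonyms rather than exact string matches, but that is a separate design decision.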

For each keyword, the color coding shows:

Goals

This visualization allows students to monitor the quality of their own contributions in an online discussion activity. It aims to support self-regulation by encouraging them to expand the breadth of their participation (covering a wider range of aspects related to the discussion topic) and to increase the depth of their contributions by integrating and articulating those aspects in a coherent manner [3].

The keywords identified by the instructor communicate some level of expectation around productive participation in the discussion, and they can be considered an external standard/reference that students can use to monitor and evaluate their progress.

In addition, students are offered a relative comparison with their peers. This is not meant to be used as a measure of comparative performance. Rather, the goal is to help students get a sense of how their classmates are engaging with different aspects of the discussion, to encourage them to learn from their peers by reading and reviewing their messages, and to trigger further engagement and collaboration. We may need to revise the visualization to avoid direct performance comparisons.

In the future, adding an individual goal-setting component (see Future Ideas below) could support agency and allow students to monitor their discussion activity against explicit, self-defined standards.

It’s important to recognize that individual differences might affect how students interpret the presented information, reflect on it, and act on it [4]. Hence, scaffolding plays a critical role in helping students prioritize these different standards/reference points (activity goals, peers, and self) and understand the value and limitations of each in a specific context.

Coherence and Latent Semantic Analysis (LSA)

Coherence has been described as “the unifying element of good writing,” and it can therefore serve as one way to measure the quality of text. A coherent text reveals the use of different strategies to connect disparate pieces of information, integrate them with each other, and tie them to prior knowledge. Latent Semantic Analysis is a Natural Language Processing technique that can be leveraged to measure the coherence of text [5, 6]. LSA algorithms compare two adjacent units of text at the semantic level to evaluate their relatedness. In the context of discussions, this can be applied at the granularity of sentences [7].

The output value of LSA for coherence theoretically ranges from 0 to 1. Higher values indicate higher semantic similarity between sentences, and thus a more coherent message, while lower values indicate low semantic similarity and an incoherent message. We can translate the output value into three levels: low, moderate, and high. This requires identifying thresholds; one way to approach that is to use a corpus of messages pre-coded for Cognitive Presence with the coding instrument that operationalizes the assessment of critical discourse and reflection [8].
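As a rough illustration of the pipeline described above, here is a Python sketch (an assumption about one possible approach, not a committed implementation; it uses scikit-learn’s TruncatedSVD as the LSA step, and the low/moderate/high cut-offs are placeholder values that would need to be calibrated against the Cognitive-Presence-coded corpus):

```python
# Minimal sketch of LSA-based coherence: embed adjacent sentences in a latent
# semantic space and score a message by the mean cosine similarity between
# neighbouring sentences. Thresholds below are placeholders, not calibrated values.
import re
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def message_coherence(message: str, vectorizer: TfidfVectorizer, lsa: TruncatedSVD) -> float:
    """Mean cosine similarity between adjacent sentences in LSA space."""
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", message.strip()) if s]
    if len(sentences) < 2:
        return 0.0  # a single sentence has no adjacent pair to compare
    vectors = lsa.transform(vectorizer.transform(sentences))
    sims = [cosine_similarity(vectors[i:i + 1], vectors[i + 1:i + 2])[0, 0]
            for i in range(len(vectors) - 1)]
    return float(np.clip(np.mean(sims), 0.0, 1.0))

def coherence_level(score: float) -> str:
    """Map a 0-1 coherence score to a display level (thresholds are assumptions)."""
    if score < 0.3:
        return "low"
    if score < 0.6:
        return "moderate"
    return "high"

# Example usage: fit the LSA model on the discussion messages, then score one message.
corpus = [
    "Peer feedback improves revision quality. Revision quality depends on the specificity of the feedback.",
    "Motivation also matters. Students revise more when they feel ownership of the task.",
]
vectorizer = TfidfVectorizer(stop_words="english")
lsa = TruncatedSVD(n_components=2, random_state=0)  # tiny only because the toy corpus is tiny
lsa.fit(vectorizer.fit_transform(corpus))
score = message_coherence(corpus[0], vectorizer, lsa)
print(score, coherence_level(score))
```

In practice the LSA space would be fit on a much larger corpus of course discussion messages so that the latent dimensions are meaningful; the toy corpus here only keeps the example self-contained.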

Integration with Learning Design

To support students’ use of these discussion analytics, increase their awareness, and contribute to productive patterns of participation in the discussion, the analytics need to be integrated with the elements of learning design. In other words, the use of analytics should be situated as part of the discussion activity and tied to its expected outcomes and goals. Guidelines on the design and facilitation of effective discussions can be found in the collaborative learning literature [9, 10].

Another element is to provide guidance to students about when the analytics might practically be consulted.

Example of a discussion activity:

[Image: discussion activity example]

Future Ideas

Technical Requirements

Prototype View

[Screenshot: prototype heatmap view of discussion topics]

The heatmap indicates the coherence of the topics: darker shades indicate higher coherence, lighter shades indicate lower coherence.
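The color encoding itself is just a monotone mapping from the 0-1 coherence score to a sequential shade. A minimal sketch, assuming ColorBrewer “Blues” endpoints (the actual prototype presumably does this in the D3/React front end, so this is only an assumption about the encoding):

```python
# Hypothetical mapping from a 0-1 coherence score to a hex color:
# lighter for low coherence, darker for high coherence.
def coherence_to_hex(score: float) -> str:
    """Interpolate from a light blue (score 0) to a dark blue (score 1)."""
    score = min(max(score, 0.0), 1.0)
    light, dark = (222, 235, 247), (8, 48, 107)  # ColorBrewer "Blues" endpoints
    r, g, b = (round(l + (d - l) * score) for l, d in zip(light, dark))
    return f"#{r:02x}{g:02x}{b:02x}"

print(coherence_to_hex(0.2), coherence_to_hex(0.8))
```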

[Screenshot: prototype topic detail view]

Clicking on a topic shows the coherence of that topic, along with snippets of where the user used the topic in the discussion. There is also an option to switch to examples of other students' discussion posts that mention the topic.
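A hypothetical sketch of how those snippets could be pulled (the function name and context-window size are illustrative assumptions, not part of MyLA):

```python
# Hypothetical snippet extraction for the topic detail view: return a short
# window of text around each mention of the selected keyword.
import re

def topic_snippets(posts: list[str], keyword: str, window: int = 40) -> list[str]:
    """Return text snippets surrounding each keyword mention (case-insensitive)."""
    snippets = []
    for post in posts:
        for match in re.finditer(rf"\b{re.escape(keyword)}\b", post, flags=re.IGNORECASE):
            start = max(match.start() - window, 0)
            end = min(match.end() + window, len(post))
            snippets.append("..." + post[start:end] + "...")
    return snippets

posts = ["In my view, timely feedback is essential because it guides revision."]
print(topic_snippets(posts, "feedback"))
```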

References

  1. Beheshitha, S. S., Hatala, M., Gašević, D., & Joksimović, S. (2016, April). The role of achievement goal orientations when studying effect of learning analytics visualizations. In Proceedings of the sixth international conference on learning analytics & knowledge (pp. 54-63). ACM.
  2. Shirazi Beheshtiha, S. S. (2015). Varying effects of learning analytics visualizations for students with different achievement goal orientations (Doctoral dissertation, Communication, Art & Technology: School of Interactive Arts and Technology).
  3. Wise, A. F. (2014, March). Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 203-211). ACM.
  4. Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71.
  5. https://en.wikipedia.org/wiki/Latent_semantic_analysis
  6. Foltz, P. W. (2007). Discourse coherence and LSA. Handbook of latent semantic analysis, 167-184.
  7. Foltz, P. W., Kintsch, W., & Landauer, T. K. (1998). The measurement of textual coherence with latent semantic analysis. Discourse Processes, 25(2-3), 285–307.
  8. Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23.
  9. Wise, A. F., & Chiu, M. M. (2014). The impact of rotating summarizing roles in online discussions: Effects on learners’ listening behaviors during and subsequent to role assignment. Computers in Human Behavior, 38, 261–271.
  10. Rovai, A. P. (2007). Facilitating online discussions effectively. The Internet and Higher Education, 10(1), 77–88.
justin0022 commented 5 years ago

If you have any questions, please post them here - Sanam and I will try our best to answer them.

justin0022 commented 5 years ago

@pushyamig for work on this view, could we create a branch in this repository? That way anyone who is interested in MyLA would have greater visibility into the progress of the Discussion view.

pushyamig commented 5 years ago

> @pushyamig for work on this view, could we create a branch in this repository? That way anyone who is interested in MyLA would have greater visibility into the progress of the Discussion view.

Yes, you can. I suggest having a separate project board for the Discussion view issues so that they all go into that bucket. Currently, they are all going into the default bucket, MyLA-Default-Project.