SCORM & Learning Analytics


**__ SCORM __**

Used as a Learner Management System, SCORM has many disadvantages because it does not manage the learning itself. SCORM-LST was developed as a framework to describe collaborative learning, assessment and facilitation for multiple users, and SALMS (short for SCORM Adaptive Learner Management Systems) works as the interpreter of SCORM-LST. Learning 2.0 is modelled on the same idea as Web 2.0, where open-source participation and sharing are key terms. Standardisation is necessary for e-learning objects to interoperate between systems: content must be compatible with the servers of a Learning Management System so that it can be re-used and shared. As discussed in the previous posts, SCORM is based on this idea of standardisation.

Despite the benefits that SCORM offers in the field of standardisation, it has its own limitations. SCORM is highly technical whereas, on the other hand, the learning process involves humans and is complex rather than merely technical. Learning is quite subjective; SCORM is rather objective. A major SCORM drawback is that it reduces learning to a rather simplistic process, which should never be the case. This makes the implementation of the SCORM standard in education less of a reality.
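That simplification is visible in SCORM's run-time data model, where a piece of content reports learning back to the LMS as a handful of key–value pairs. The sketch below is a hypothetical illustration in Python of the kind of record an LMS ends up storing; the element names come from the SCORM 1.2 cmi data model, but the `set_cmi_value` helper and the sample values are invented for this example.

```python
# Hypothetical sketch: the kind of per-learner record a SCORM 1.2 LMS keeps.
# Element names are from the SCORM 1.2 run-time data model; the helper
# function below is an invented stand-in for an LMSSetValue call.

VALID_STATUSES = {"passed", "completed", "failed",
                  "incomplete", "browsed", "not attempted"}

def set_cmi_value(record, element, value):
    """Store one cmi element, with minimal validation."""
    if element == "cmi.core.lesson_status" and value not in VALID_STATUSES:
        raise ValueError("invalid lesson_status: " + value)
    record[element] = value
    return record

record = {}
set_cmi_value(record, "cmi.core.student_id", "s0042")
set_cmi_value(record, "cmi.core.score.raw", "85")
set_cmi_value(record, "cmi.core.lesson_status", "completed")
set_cmi_value(record, "cmi.core.total_time", "0000:25:10")

# The whole "learning record" is just these few fields -- which is
# exactly the simplification criticised above.
print(record)
```

Everything the standard tracks about a learning session fits in a record like this, which is why critics say SCORM flattens a complex human process into a few objective numbers.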

**__ Learning Analytics __**

Learning analytics is "the measurement, collection, analysis and reporting of data about learners and their context, for purposes of understanding and optimising learning and the environments in which it occurs" (1st International Conference on Learning Analytics and Knowledge). Siemens defines it as "the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning." [] It focuses on the learning process and thus on the relationships between the different stakeholders in learning: the learner, the content, the institution and the educator.

According to the Horizon Report (2013), learning analytics are "envisioned as an effective, efficient way to assess student responses, provide immediate feedback, and make adjustments in content delivery and format." Present web-tracking tools can track exact student behaviours: they can record simple variables such as the time spent and the number of clicks on a page, as well as more complex information such as resilience and retention of concepts. For learning analytics to become a reality, rather than a far-fetched ideal, a cycle has to be followed. The cycle is made up of:

1. Course-level

Social network analysis, learning trails, discourse analysis

2. Educational data-mining
Predictive modelling, clustering and pattern mining. We strive to move beyond the LMS. Educational data mining is one tool in learning analytics. It gives students the opportunity to compare what they did against what others did, guiding them, say every 15 minutes, on what they should do next. The problem with such systems is that students get used to being told what to do by the software, which will not happen in the real world. Nor does this transmit 21st-century skills; instead, students become software-dependent. Thus whether or not to make use of educational data mining in online courses is quite subjective and depends on the lecturer's personal preferences (Duval, 2013).

3. Intelligent curriculum

The development of semantically defined curricular resources. When learning analytics is introduced, the whole curricular experience is transformed into one that does not depend on textbooks but rather on the learner engaging in his own learning experience whilst creating learning content in real time.

4. Adaptive content

Choosing content based on the learner's behaviour and what the system recommends.

5. Adaptive learning

A process that includes social interactions, learning activity and learner support.
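The steps of the cycle above can be sketched in miniature. The code below is a hypothetical illustration, not a real system: raw click events (step 1) are reduced to per-student metrics, a simple threshold stands in for the predictive models of step 2, and the result drives a content choice (steps 4–5). All names and data are invented for the sketch.

```python
# Hypothetical miniature of the learning-analytics cycle.
# events: (student, page, seconds_on_page) tuples -- the raw "digital exhaust".
events = [
    ("amy", "intro", 120), ("amy", "quiz1", 300), ("amy", "quiz1", 200),
    ("ben", "intro", 30),  ("ben", "quiz1", 45),
]

# Step 1 (course level): aggregate clicks and time on task per student.
metrics = {}
for student, page, seconds in events:
    m = metrics.setdefault(student, {"clicks": 0, "seconds": 0})
    m["clicks"] += 1
    m["seconds"] += seconds

# Step 2 (educational data mining, toy version): a fixed threshold stands
# in for a trained predictive model -- students with little time on task
# are flagged as at risk.
at_risk = {s for s, m in metrics.items() if m["seconds"] < 120}

# Steps 4-5 (adaptive content / adaptive learning): choose the next
# resource based on the learner's behaviour.
def recommend(student):
    return "remedial-exercises" if student in at_risk else "next-chapter"

print(metrics)            # per-student clicks and time on task
print(recommend("ben"))   # prints: remedial-exercises
print(recommend("amy"))   # prints: next-chapter
```

A real system would replace the threshold with predictive modelling, clustering or pattern mining over far richer data, but the shape of the cycle — observe, model, adapt — is the same.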



**__ Advantages of learning analytics: __**

 * ==== Facilitates better administrative decision-making and resource allocation ====
 * ==== By monitoring the learner's activities in the LMS/VLE, educators can intervene and assist learners in reaching success. Moreover, they may also identify students who are in danger of dropping out. ====
 * ==== A shared understanding of the college's successes and failures can be made possible. In this way pedagogical approaches and academic models can be reinforced or adapted. ====
 * ==== It provides learners with information about their own learning habits and can give recommendations for improvement. It allows learners to "compare their own activity… against an anonymous summary of their course peers" (Siemens & Long, 2011). ====
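The last advantage — showing a learner their own activity next to an anonymous summary of their peers — can be sketched as follows. The data and the `peer_summary` function are hypothetical, invented for this illustration.

```python
# Hypothetical sketch of the peer comparison described by Siemens & Long
# (2011): a learner sees their own activity against an anonymised average.
activity = {"amy": 42, "ben": 17, "cara": 35, "dan": 8}  # e.g. posts per term

def peer_summary(activity, learner):
    """Average activity of everyone except the learner -- no names leak out."""
    peers = [v for s, v in activity.items() if s != learner]
    return sum(peers) / len(peers)

me = "ben"
print(f"your activity: {activity[me]}")
print(f"anonymous peer average: {peer_summary(activity, me):.1f}")
```

Only the aggregate crosses to the learner's screen, which is what keeps the peer summary anonymous.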

**__ Tracking down the beginning of Learning Analytics __**
One of the first universities to put learning analytics into practice was Purdue University, which launched the Signals project in 2007. Efforts to personalise students' experiences through data were also made by Saddleback Community College in Orange County. Over the years several advances were made in the field of learning analytics, and a mobile application called Persistence Plus was developed by the Kaufmann Labs Education Ventures. This app interacts with students continuously, gives each a clear idea of their individual progress, compares them with other students, and links them to external resources they can use to progress further. In late 2012 CourseSmart Analytics was developed, a package from the digital textbook provider CourseSmart.

Presently, at Austin Peay State University in Tennessee, university advisors are using software called Degree Compass. This software uses predictive analytic techniques to help students work out which courses they need to take to obtain their degree, pairing these with courses in which they are likely to be successful. This helps advisors lead students to the best career learning path for them.

The Horizon Report 2013, gives several examples of learning analytics being used in higher educational settings. These include:
Further reading: []
 * ==== The Glass Classroom ====
 * ==== jPoll at Griffith University ====
 * ==== Learning Analytics Seminars ====
 * ==== Predictive learning analytics framework ====
 * ==== Stanford University's Multimodal Learning Analytics ====

Learning analytics is growing quickly but is still not used by the majority of educators. The world nowadays is all about data: huge amounts of data are kept on every individual by organisations and entities worldwide, including universities. "The promise of learning analytics is actionable data relevant to every tier of the educational system" (Horizon Report, 2013).

Erik Duval is a major figure in the field of Open Learning Analytics. Nowadays more than ever before, learning is being transformed from a traditional process and moved online, thus becoming more open. Duval is one of the lecturers who made use of MOOCs. MOOC is short for Massive Open Online Course, in which people from all around the world are given the opportunity to follow a course online for free; to take part, all you need is the course's #tag. In a video called Open Learning Analytics, Duval describes the reality we are aiming for: a web full of #tags where people can learn anytime, anywhere and for free. This makes it possible to teach thousands of students together, which would not be possible in the context of a traditional classroom. Moreover, a course can be completed in a shorter period of time by means of online learning, since there is a continuous dissemination of data and students are continuously collaborating, sharing and producing their own learning. This can be linked to the SCORM standard for learning objects, which involves the use of tags to promote re-usability; this e-learning standard should be implemented in learning analytics for the field to advance.

Duval focuses on the idea that exams should be a means, not a goal: they should be used to test whether students have understood the concepts covered so far, not as a determinant of progression. In his courses he makes use of an open badge system where students are given virtual badges according to their achievements. Learning analytics operate on the basis of digital exhaust. This means capturing all of the students' activities, such as what they post, the number of clicks and the duration of participation, which makes what they do measurable. Morris, Finnegan, and Wu compared different basic activities related to LMS participation, like those mentioned above, and report significant differences between "withdrawers" and "successful completers". According to Morris et al., "time spent on task and frequency of participation are important for successful online learning." Duval states that "activity in LMS is only the tip", as very little learning happens there; the concept of learning analytics is much broader. Duval also promotes the use of dashboards in learning: a dashboard-like app that the student manipulates himself, giving him the opportunity to view his progress and to decide for himself what to do in order to improve.
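The kind of comparison Morris, Finnegan, and Wu report can be illustrated with a small sketch. The numbers below are invented for the example, not taken from their study; only the shape of the analysis — comparing groups on time on task and frequency of participation — follows the paper.

```python
# Invented illustration, in the spirit of Morris et al. (2005): compare
# "successful completers" with "withdrawers" on time on task and
# frequency of participation. All data below are fabricated.
students = [
    # (name, completed_course, hours_on_task, logins_per_week)
    ("s1", True,  38, 6), ("s2", True,  41, 5),
    ("s3", False, 12, 2), ("s4", False,  9, 1),
]

def group_mean(rows, completed, index):
    """Mean of one column for the completers (True) or withdrawers (False)."""
    vals = [r[index] for r in rows if r[1] == completed]
    return sum(vals) / len(vals)

for label, completed in (("completers", True), ("withdrawers", False)):
    print(label,
          "hours on task:", group_mean(students, completed, 2),
          "logins/week:", group_mean(students, completed, 3))
```

Even this toy comparison shows why "time spent on task and frequency of participation" are the variables such studies focus on: the two groups separate cleanly on both.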



The LMSs used at present do not guide students or support them with options on what to do next; they just give students a list of the courses they are following. Learning analytics, on the other hand, can be much more beneficial: they can even be used to monitor students' moods rather than just quantitative data, and to examine the connections between students in a particular course, providing an opportunity to cater for isolated students.
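Finding isolated students from their connections, as suggested above, can be sketched with a simple interaction graph. The enrolment list and reply data below are hypothetical, invented for this illustration.

```python
# Hypothetical sketch: flag isolated students from forum interactions.
# Each pair records one student replying to another in a course forum.
enrolled = {"amy", "ben", "cara", "dan"}
replies = [("amy", "ben"), ("ben", "amy"), ("cara", "amy")]

# Anyone who appears in at least one interaction is connected;
# everyone else in the course is isolated.
connected = {student for pair in replies for student in pair}
isolated = enrolled - connected

print(sorted(isolated))  # students an educator might reach out to
```

A fuller social network analysis would weight edges and look at clusters, but even this degree-zero check surfaces the students no one is talking to.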



For more information watch the following video by Erik Duval: []

**__ Integrating Learning Analytics in the Physical Environment __**

One example is an interactive table that lights up according to how much you talk when discussing or working as a group: the more you talk, the more it lights up, giving you feedback on how much you are talking. Studies show that this helps you regulate your behaviour.




**__ Conclusion __**

Whilst learning analytics are extremely powerful in learning, they are sometimes regarded as too deterministic, assuming that future conditions can be determined from knowledge of a person's past and present. In education this is certainly not the case. Another issue is how transparent the algorithms and their weightings are, and how real-time analytics should be in a classroom setting. Learning analytics are also said to rely heavily on behavioural theories of learning, which makes it difficult to account for anything other than behavioural data. Despite this, learning analytics are said to be effective in bridging the gap between all education stakeholders, especially educators and students. Even though they are still being developed and a lot of work remains to be done in the field, learning analytics have the ability to "penetrate the fog of uncertainty around how to allocate resources, develop competitive advantages, and most important, improve the quality and value of the learning experience" (Siemens & Long, 2011).

**__ References __**

1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, February 27–March 1, 2011, as cited in George Siemens and Phil Long, "Penetrating the Fog: Analytics in Learning and Education," EDUCAUSE Review, vol. 46, no. 5 (September/October 2011).

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. Retrieved October 25, 2013, from http://www.elmhurst.edu/~richs/EC/OnlineMaterials/SPS102/Teaching%20and%20Learning/Penetrating%20the%20Fog.pdf

NMC Horizon Report 2013 Higher Education Edition. Retrieved October 26, 2013, from http://net.educause.edu/ir/library/pdf/HR2013.pdf

Duval, E. (2013, June 18). Open Learning Analytics: Erik Duval at TEDxUHowest. Retrieved October 27, 2013, from http://www.youtube.com/watch?v=LfXDzpTnvqY

Libby V. Morris, Catherine Finnegan, and Sz-Shyan Wu, “Tracking Student Behavior, Persistence, and Achievement in Online Courses,” The Internet and Higher Education, vol. 8, no. 3 (2005), pp. 221–231.