Lea's Blog

 

Guest editors: Dragan Gašević, Carolyn P. Rosé, and George Siemens

 *** Submission deadline: February 1, 2016 ***

 

The availability of digital data affords unprecedented possibilities for analysis of different aspects of human life. These possibilities have mobilized researchers, practitioners, institutional leaders, policy makers, and technology vendors to look at the ways these data can be used to understand and enhance learning and teaching. This intensive interest has given rise to the formation of the new field of learning analytics. According to the Society for Learning Analytics Research (SoLAR: http://solaresearch.org/), learning analytics is defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” The field of learning analytics builds on and links to numerous other disciplines including learning sciences, social network analysis,

(educational) data mining, language technologies, machine learning, information visualization, computational thinking and sense-making, (educational, social, cognitive, organizational) psychology, and educational theory.

 

Early results in the field of learning analytics offer much promise for identifying learners at risk of failing or dropping out of a course, understanding information flow in social interactions, and identifying different cognitive, metacognitive, and affective states in discourse and other behavior traces. These results have attracted many institutions to invest in systemic implementation of learning analytics, development of relevant institutional policies, and creation of partnerships with organizations specializing in learning analytics. Data about user interactions with learning technologies are the main resources used in learning analytics. Given that the field of learning analytics is still in its early days, research is necessary to provide more effective methods for understanding the factors that influence learning and for optimizing learning outcomes and processes across cognitive, metacognitive, social, and affective dimensions. Moreover, there is a need for theoretically sound and empirically validated frameworks for:

 

- presentation of results of learning analytics to a variety of

   stakeholders, including decision makers, administrators, instructors

   and students,

 

- development of novel learning technologies offering effective ways

   for personalization support and continuous improvement,

 

- systemic institution-wide deployment and implementation of learning

   analytics, and

 

- ethical and safe use of data and learning analytics that protects

   user privacy while maximizing wellbeing.

 

This special issue of IEEE Transactions on Learning Technologies calls for papers that address the above gaps by reporting a combination of theoretical/conceptual and empirically validated findings. The accepted papers will contribute to the existing body of research knowledge in the field of learning analytics and will offer a sound empirical base that can motivate and inform practice. Submissions that build bridges between learning analytics and related disciplines to enhance its impact are especially welcome.

 

Important Dates

---------------------------------------------------------------------------

 

- Full manuscripts due: February 1, 2016

- Completion of first review round: June 1, 2016

- Revised manuscripts due: July 31, 2016

- Final decision notification: October 31, 2016

- Publication materials due: November 30, 2016

- Publication of special issue: early 2017 (possibly the Jan-Mar 2017

   issue, i.e., vol. 10, no. 1)

 

Submission and Review Process

---------------------------------------------------------------------------

 

Full manuscripts should be prepared in accordance with the IEEE Transactions on Learning Technologies guidelines

(http://www.computer.org/portal/web/tlt/author) and submitted via the journal's ScholarOne portal (https://mc.manuscriptcentral.com/tlt-cs),

making sure to select the relevant special issue name.

 

Manuscripts should not be published or currently submitted for publication elsewhere. Only full papers intended for review, not abstracts, should be submitted via the ScholarOne portal. Each full manuscript will be subjected to peer review.

 

Guest Editors' Details

---------------------------------------------------------------------------

 

- Dragan Gašević

   Professor and Chair in Learning Analytics and Informatics,

   University of Edinburgh

   President, Society for Learning Analytics Research

 

- Carolyn Penstein Rosé

   Associate Professor, Carnegie Mellon University

   President, International Society of the Learning Sciences

 

- George Siemens

   Professor and Executive Director, University of Texas at Arlington

   Founding President, Society for Learning Analytics Research

 

Contact: ieeetlt.la@gmail.com

 

**********************************************************************

 

Lea in the Box
10/28/15

What is Direct Assessment?

 

Learning by practice and performance sounds very logical if students are expected to acquire clearly defined skills and competencies. With growing demand, especially from parents, for cross-cutting skills such as critical thinking, problem-solving, communication, and collaboration, some schools (especially in the USA) are trying to employ new educational methods such as experiential learning, project-based learning, or flipped learning. Adoption of these methods, however, is not yet national policy. On the other hand, certain pockets of education, such as medical training, are obliged to use intensive practice for learning. Community colleges in the USA and polytechnic schools in Europe are two- or four-year higher education institutions, mostly devoted to vocational training for those who want to excel in their current professions or to acquire new ones. Unlike medical schools, these institutions need to expand experiential learning across many degree programmes. As early as the 1970s, some of these schools made efforts to take up an outcomes-based approach called Competency-Based Education (CBE). It is only during the last few years, however, that CBE has become a popular and officially recognized methodology in the USA.

 

The US Department of Education inspects submitted CBE programs, approves them based on published guidelines, and includes them in a growing network of schools that credit coursework based on "Direct Assessment." In fact, what the government approves is actually the assessment plan. Direct assessment is so central to CBE that the degree offered is often called a "Direct Assessment Programme."

In the CBE method, "transfer of knowledge" happens outside the classroom. The school is "agnostic" to content but provides a select set of resources linked to the clearly defined competencies. Students are free to use various resources online, usually supported by personalized learning software to organize their study. The faculty then facilitates and guides performance activities, provides timely and meaningful feedback, and runs direct assessment throughout. As their learning translates to real life, students gain mastery of the competencies in a way that lets them be immediately productive upon degree completion. The CBE curriculum is data driven, and students commonly use learning analytics software such as Acrobatiq or eLumen to self-monitor their performance.


Blunt instruments such as standardized tests are ill-suited to measuring competency. Since course credits can be earned only through direct assessment, it is essential to develop a normative new method for it. One candidate method is rubric-based assessment. In 2011, the Association of American Colleges & Universities launched a multi-state initiative to advance learning outcomes assessment using normative rubrics. Fifty-nine institutions collaborated to develop rubrics for 16 intellectual and practical skills such as information literacy, critical thinking, and integrative learning. Faculty members were trained, and through pilot studies the rubrics were tested on campuses and rewritten three times before reaching a final version.

The results of the pilots on rubric-based assessment are very encouraging. Not only did student engagement, course completion rates, and mastery demonstration improve substantially, but participating professors also claimed it improved their teaching, as clearly indicated by the responses they received from their students.

Supported by strong foundations such as Lumina and Gates, and by the Association of American Colleges and Universities, CBE is now expanding to K-12 education.

Lea in the Box
10/28/15

Talente Regional!

Talente Regional is a funding program of the Austrian Research Promotion Agency (FFG), financed by the Federal Ministry for Transport, Innovation and Technology (bmvit). The program's goal is to discover talents in Austria's schools, to inspire them, and to spark their enthusiasm for research and technology (in the so-called STEM subjects). To this end, schools, research partners, and companies are brought together to pursue research and innovation jointly. Michael Kickmeier-Rust, LEA's Box coordinator, attended the networking event and made contacts for project initiatives in the 2016/17 school year. The goal is to bring students closer to research on learning, as precisely the area that concerns them most directly. More information is available at https://www.ffg.at/content/netzwerkveranstaltung-talente-regional.

Measuring and Visualizing Learning in the Information-Rich Classroom collects research on the implementation of classroom assessment techniques in technology-enhanced learning environments. Building on research conducted by a multinational and multidisciplinary team of learning technology experts and specialists from around the globe, the book addresses the challenges of measuring learning in such environments. With contributions from major researchers in education technology, testing and assessment, and educational psychology, it contributes to a holistic approach for building the information infrastructure of the 21st-century school.

Among the book's editors are Lea's Susan Bull and Michael Kickmeier-Rust!

https://www.routledge.com/products/9781138021136

The LEA’s Box ‘open learner model’ allows learners and teachers to view visualisations of individuals’ competencies, inferred according to their online interactions, or data provided by the teacher. This is, in itself, exciting.

However, now learners can also take part in a limited form of discussion about their competencies. This allows them to try to get LEA’s Box to update their learning data (for example, if they have just had a lesson and so now understand more), and it helps them to focus on, and think about, their learning. If they try to increase a value for any of their competencies, they will have to justify their reasons according to statements that the teacher has provided for them (for example, ‘I did well in my homework’ or ‘I scored over 70% on the last quiz’). They may even give new reasons, if none of the existing ones are appropriate. This will help the teacher to add to the set of available justifications, and it will help students to identify which activities are most helpful in improving their competencies.

Of course, LEA’s Box will not just allow students to make unacceptably big changes. The teacher defines the maximum change possible (up or down), and LEA’s Box may offer a compromise value if the learner is trying to make a change outside the threshold defined by the teacher. The first deployment of this discussion component is about to happen with university students who are learning Italian. We hope that it will soon be in schools, too.
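The threshold-and-compromise rule described above can be sketched in a few lines. This is a minimal illustration, not LEA's Box's actual implementation: the function name, the 0.0-1.0 competency scale, and the boolean "justified" flag are all assumptions introduced for the example.

```python
def negotiate_competency_update(current: float, proposed: float,
                                max_change: float, justified: bool) -> float:
    """Return the competency value the system accepts after a learner's request.

    current    -- value currently inferred by the system (assumed 0.0-1.0)
    proposed   -- value the learner is arguing for
    max_change -- largest change, up or down, that the teacher permits
    justified  -- whether the learner supplied an accepted justification
    """
    if not justified:
        return current  # no accepted justification: the model stays as it is

    delta = proposed - current
    if abs(delta) <= max_change:
        return proposed  # within the teacher's threshold: accept as requested

    # Outside the threshold: offer a compromise at the boundary the
    # teacher defined, moving in the direction the learner requested.
    return current + max_change if delta > 0 else current - max_change
```

For example, a learner at 0.5 who argues for 0.9 under a teacher-defined maximum change of 0.2 would be offered the compromise value 0.7, while a request for 0.6 would be accepted outright.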


Lea's Learning Analytics Blog

Learning analytics, educational data mining, formative assessment: all recent buzzwords in educational research. In principle, the idea is to find theoretical frameworks, models, procedures, and smart tools to collect, aggregate, analyze, reason over, and visualize large-scale educational data. LEA’s Box is a research and development project funded by the European Commission. The project aims (a) to make educational assessment and appraisal more goal-oriented, proactive, and beneficial for students, and (b) to enable formative support for teachers and other educational stakeholders on a solid basis of a wide range of information about learners. In other words, LEA’s Box is a learning analytics toolbox intended to enable educators to perform competence-centered, multi-source learning analytics. More info at http://www.leas-box.eu!
