Digital Media & Copyright in Higher Ed: Reduce Risk & Avoid Liability (Higher Ed Hero Webinar)

Register to attend: http://bit.ly/Copyright-Oct32014
Date: October 3, 2014
Time: 1:00 PM – 2:00 PM
Location: GEB A304

Overview:

The use of music, movies, and digitized textbooks on college campuses, in and out of the classroom, is an everyday occurrence. But are these resources used in a manner that does not infringe copyright law? When you can and can’t apply fair use for educational purposes is confusing. Recent rulings on fair use offer lessons that will help faculty and staff navigate copyright law and avoid litigation and fines.

Join us for this 60-minute webinar where you will discover:

  • Clear instructions for making fair use determinations
  • Your rights for copying DVDs, CDs, video clips & more
  • Online & hybrid courses: Tips for streaming video outside the classroom
  • How the internet changes fair use – exceptions & exclusions
  • Lessons learned from Georgia State, the Hathi Trust decision & others

AIHC Webinar – Measures and Results from an Ambulatory, Interprofessional Team OSCE Project

Register to attend: http://bit.ly/AIHC-Sep302014
Date: September 30, 2014
Time: 1:00 PM – 2:00 PM
Location: GEB A204

Presenters: Sheree Aston, OD, PhD (Vice Provost, Western University of Health Sciences) and David Dickter, PhD (Director, Interprofessional Education Research and Strategic Assessment, Western University of Health Sciences)

Session Overview:
This American Interprofessional Health Collaborative (AIHC) webinar focuses on a five-year project, conducted with the California Geriatric Education Center at UCLA and funded by the Health Resources and Services Administration (HRSA), that assisted Western University of Health Sciences in developing the Ambulatory Team Observed Structured Clinical Evaluation (ATOSCE).

This presentation will provide examples from the ATOSCE performance assessment tool, reliability findings and preliminary findings of performance levels among groups of students. As part of the HRSA grant, a follow-up toolkit will be made available for interested educators and researchers.

Using Cloud-based Applications to Support Learning Objectives: BLOOMing with Technology

The Sloan Consortium hosted this webinar, which provided an overview of the digital makeover that Bloom’s taxonomy of educational objectives has undergone. The Bloom’s Digital Taxonomy Pyramid makes thinking about technology tools in this context a breeze. This session explored how technology tools can be used at various levels of the digital taxonomy to encourage higher-level thinking and problem solving, and provided faculty with creative and innovative ideas for integrating Web 2.0 tools at each level.

 

Introduction to Bloom’s Digital Taxonomy is located here.

Interactive Bloom’s Revised Digital Taxonomy – one version located here.

APPLYING

http://bubbl.us – Anatomy & Physiology students collaborate in a course using this free resource. Students are assigned specific areas – some areas are left blank – requiring them to insert specific information (such as the anatomical part and its physiology).

http://www.twitter.com – Using Twitter for a public service announcement. Students must learn how to use Twitter and compose a PSA in 140 characters or less. Screenings (cholesterol, cancer, diabetes, prostate, etc.) are assigned to students, who then use Twitter to indicate why the screening is important.

http://www.letterpop.com – a great way for students to compose a newsletter and demonstrate they understand the information

Voki or VoiceBoards (integrated in some versions of Blackboard) or AudioBoo – use these for medical terminology. Students submit all their assignments using one of the three tools. In a survey, 96% of students said they wanted this used more, because they were able to hear the instructor pronounce the words.

Screencast-o-matic – students can present and show their work. Similar to Jing, Snag-It, or Camtasia. Free to use, with closed captioning available for $15. In anatomy/physiology, students are given screenshots and required to indicate where the origin of the problem was.

ANALYZING

Google Docs

Create-a-Graph or other infograph tool

RSOE Emergency and Disaster Information Service located here.

Vizualize.me

 

Webinar facilitator: Dr. Julia VanderMolen is the Department Coordinator and Assistant Professor for Science and Health Online at Davenport University, Grand Rapids, Michigan. She is a 2011 Teaching Excellence Award and 2012 Blackboard Exemplary Course Award winner. She has expertise in online learning and currently provides expertise to the International Association for K-12 Online Learning (iNACOL) as a member of the research committee for best teacher education practice. She graduated with a Ph.D. in Educational Leadership with an emphasis in Career and Technical Education from Western Michigan University. She has an M.Ed. in Educational Technology from Grand Valley State University and an M.A. in Health Science from the University of Alabama. She has presented at a number of conferences on the topic of educational technology and online learning.

 

Riding the Storm: Improving Course Performance/Interaction through Analytics and Proactive Methods of Engagement

Today’s Friday Focus on e-Learning was a replay of a session from the 18th annual Sloan Consortium Conference on Online Learning, held in October 2012. The presenter was John Vivolo, Manager of Online Learning, Polytechnic Institute of New York University.

Most learning management systems (LMSs) can generate numbers that can be placed in spreadsheets to proactively make alterations in the course (as the course is taking place).

Reactive – make changes to a course AFTER poor student performance on an assignment/exam. Student evaluations are an example of a reactive measure. Academic department evaluations are also typically reactive (done at the end of the semester).

Proactive – set up preventive measures prior to an assignment/exam. Time-based, Individual assignments/content, Discussion boards.

Interaction: student-student, student-faculty, student activities.

1. Time-based activity – days of the week, times of day, days of month. Can influence improvements in content availability, assignment due dates, virtual office hours, etc… When are students going in and doing activities in the course?

What is your goal?

  • To accommodate student schedules? To reach the class at peak activity? Solution: Make content available at the peak of activity or right beforehand.
  • To create an equal distribution of activity through the week? Solution: Make content available at the lowest point of activity. Global solution: Create an “Interaction policy” (proactive). This policy would refer to student & faculty interactions, as well as when students are expected to go in to get various items from the course (or complete specific activities).
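To see where those peaks and troughs are, LMS access data can be tallied by hour. A minimal sketch in Python – the log format and function name here are hypothetical illustrations, not taken from any particular LMS:

```python
from collections import Counter
from datetime import datetime

def peak_activity_hour(timestamps):
    """Return the hour of day with the most activity, given
    ISO-format access timestamps from a hypothetical LMS export."""
    hours = Counter(datetime.fromisoformat(ts).hour for ts in timestamps)
    return hours.most_common(1)[0][0]

logs = ["2012-10-08T20:15:00", "2012-10-08T21:05:00",
        "2012-10-09T20:40:00", "2012-10-09T09:10:00"]
print(peak_activity_hour(logs))  # → 20 (most accesses fall in the 8 PM hour)
```

Content could then be released at, or just before, that hour – or at the lowest-activity hour, depending on which goal above applies.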

2. Individual Assignments/Content: Hits – track how often students view a content item. Goal: avoid a wait-and-see approach.

How can you use the number of times students clicked on a podcast? If clicks far exceed the number of students in the class, it could indicate that students are having trouble with (don’t understand) the content.

If they’re clicking on content a LOT, then before moving on (or before the next exam) provide:

  • a discussion board Q&A
  • a review sheet
  • a review webinar
  • a non-graded quiz

If one student seems to be accessing the content over and above others, reach out to that specific student to see if s/he has questions.
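The hit-count heuristic above could be sketched as follows; the threshold factor and data layout are invented for illustration, not taken from the webinar:

```python
def flag_heavy_content(hits_by_item, class_size, factor=3):
    """Flag content items whose total hit count far exceeds class size --
    a possible sign students are struggling with that content.
    `factor` is an arbitrary illustrative threshold."""
    return [item for item, hits in hits_by_item.items()
            if hits > factor * class_size]

# Hypothetical hit counts for a 30-student class:
hits = {"podcast_3": 95, "reading_1": 28, "lecture_2": 31}
print(flag_heavy_content(hits, class_size=30))  # → ['podcast_3']
```

Flagged items would then prompt a review sheet, webinar, or non-graded quiz before the next exam, as listed above.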

3. Discussion boards – the most commonly used interactive tool. Purpose: simulate an in-class discussion but in an asynchronous method.

Common discussion board goal – create a discussion that remains a fluid conversation over the week.

Can create discussion board interaction policy – respond to initial question, post at least once before X day, may have more than one topic or thread, etc…

Alerts can be set in Blackboard:

  • attendance alerts – students/faculty don’t access course in X amount of time
  • assignment/content alerts – student does not access content or assignment
  • due date alerts – student has not submitted assignment (before/after) due date

Managers/Directors

  • can use analytics for online faculty oversight
  • data can be collected for faculty – are they interacting; what and how are they interacting

Q&A at end of session:

  1. Do faculty feed the analytics back to the students (“this is what I’m seeing”)? The presenter recommends NOT feeding the analytics back to students; he believes students will feel as if they are being watched.
  2. Who runs the analytics reports? The Manager of Online Learning runs them, distributes to faculty, and then faculty do what they will with the information.
  3. Is the Interaction Policy something that is set centrally (overall), by college or department or faculty? It’s variable.

How do YOU use analytics to assess performance and/or adjust your courses?

Curriculum Mapping in an Age of Competency-based Education

This International Association of Medical Science Educators (IAMSE) webinar was led by Sascha Benjamin Cohen and Chandler Mayfield. They were instrumental in the development of Ilios as a way to enhance their curriculum management project. Approximately 5 years ago Ilios was reconceptualized and made available for all health professions education/curriculum.

NOTE TO UTHSC READERS – The slides for the session were very good. Contact Cindy Russell from your uthsc.edu email address if you’d like a link to those slides. Since we are an institutional subscriber to the IAMSE webinar series we can share this information internally to UTHSC, but not to external individuals.

Visit http://iliosproject.org to learn more about Ilios.

A good point was made about noise vs. data – a tendency among many of their faculty is to try to create relationships between and among a great deal of the program and course data/outcomes. It is a continuing dialectic to help faculty understand the power within the system so it can be used more clearly and expressively.

They have found points where the conceptual map corresponds with specific variables. They then put a constraint in at that point. Within a given course they don’t put in constraints as they assume there is a multiplicity of mechanisms for any given student to traverse and gain that outcome.

There is a need for strong curriculum governance in all of this. Use the data, get in front of the faculty, and get buy-in across the curriculum on how the data are used. There are a lot of business engineering processes that need to be planned for outside of Ilios. All objectives must be testable – don’t attach an objective to something that won’t be ready to be assessed. There were discussions about the number of objectives – for a course, is 62 enough or not enough?

The MedBiquitous Consortium is working to develop technology standards for health professions education. Check out their resources and site.

Ilios does not manage assessments and outcomes. Ilios is a curriculum engine. Assessments and outcomes come FROM the curriculum and are not part of the Ilios system. Reporting is done by combining outcomes with the curricular information. Ilios drives the activities VS capturing the activities.

Next Generation Learning: What is it? And will it work?

Today’s Friday Focus on e-Learning is a replay of the EDUCAUSE Learning Initiative (ELI) session from 2/5/13. The presenter is Dr. Barbara Means, an educational psychologist and Director of the Center for Technology in Learning at SRI.

What is NGL?

  • NGL better prepares students for a world that values and rewards deeper learning, collaboration, skilled communication, self-management, the ability to work across disciplines, and innovation practices.
  • NGL meets each student where s/he is and provides content, pedagogy, & access opportunities to meet individual needs.
  • NGL capitalizes on the affordances of technology for learning.
  • NGL collects detailed data about the process of learning that can be used to diagnose student needs, and provide feedback to the instructional developer.

Challenge areas:

  1. Deeper learning – richly interactive technologies that increase student engagement and learning of conceptual content and 21st century skills. Example: U of Wisconsin-Milwaukee’s U-Pace – self-paced intro psych course; mastery based, shorter modules & end of module quizzes; timely & tailored feedback.
  2. Blended learning – combinations of online and teacher-led instruction to improve learning, increase completion, and lower costs. Example: Cal State U Northridge – redesigned gateway math course as hybrid alternative to conventional college algebra.
  3. Open Core Courseware – high-quality, modular, openly licensed courseware for developmental, gateway, & high-enrollment core courses. Example: Cerritos College’s Project Kaleidoscope – 12 different OER courses implemented on 9 campuses.
  4. Learning Analytics – software for collection, analysis, & real-time use of learning data by students, instructors, & advisors to improve student success. Example: Marist College’s Open Academic Analytics Initiative.

What was learned:

  • Most Wave 1 innovations didn’t really have evidence of effectiveness before the grants began.
  • Many technology components weren’t completely developed before the grants started.
  • The most commonly reported difficulties were technology problems, followed by student resistance. Students often weren’t comfortable being in charge of their own learning.

Broader implications of the data:

  • There are campus impediments to a fast start.
  • Faculty who volunteer to try out new learning technologies typically respond more positively to innovations.
  • There are few online and blended learning initiatives set up to collect rigorous evidence of the innovation’s impact on students.

Barriers to collecting rigorous evidence:

  • Campus policies or IRBs may prohibit assigning students to courses with significant online components at random.
  • Some campus research offices weren’t willing to release student-level data.
  • Different instructors often don’t want to administer the same assessment.
  • Valid, reliable assessments weren’t readily available for many of the projects’ learning objectives.

U-Pace project Outcomes:

  • compared 230 students in U-Pace psychology to 334 students in conventional psychology course
  • positive effects on % of students earning an A or B (ES = +.96) and course completion (ES = +.35)
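The webinar did not specify how these effect sizes were computed. For differences in proportions (such as the share of students earning an A or B), one common choice is Cohen's h; the pass rates below are hypothetical, not the study's actual numbers:

```python
import math

def cohens_h(p1, p2):
    """Cohen's h effect size for a difference between two proportions.
    (One common metric; whether the U-Pace evaluation used it is not stated.)"""
    return 2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2))

# Hypothetical pass rates, NOT the study's actual figures:
print(round(cohens_h(0.70, 0.50), 2))  # → 0.41
```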

Cal State Univ Northridge outcomes:

  • compared 4,479 who took the hybrid course to 1,825 students from past courses
  • again, large positive effect sizes

The presenter went on to describe MOOCs and how the features of those course delivery models fit with or vary across different platforms.

Want to hear all of this session yourself while you’re at your own computer? Let Cindy Russell know and you can obtain the login to watch it at your place and time.

 

What’s the Horizon Report and What’s it mean to me?

Replay of the recording of the 2013 ELI session where the Horizon Report was first released.

Released on 2/4/13, the NMC (New Media Consortium) Horizon Report, Higher Education Edition, is an annual “unbiased source of information that helps education leaders, trustees, policy makers, and others easily understand the impact of key emerging technologies on education, and when they are likely to enter the mainstream.” This is the 10th annual edition.

The Horizon Report is about LEARNING.

Time to adoption horizon:

  • One Year or Less:
    • Massive Open Online Courses (MOOCs)
    • Tablet Computing
  • Two to Three Years:
    • Games and Gamification
    • Learning Analytics
  • Four to Five Years:
    • 3D Printing
    • Wearable Technology

 

Key emerging trends:

  1. Openness – concepts like open content, open data, and open resources, along with notions of transparency and easy access to data and information – is becoming a value.
  2. MOOCs – weren’t even on last year’s report; but today are on the near-term list.
  3. The workforce demands skills from college grads that are more often acquired from informal learning experiences than in universities.
  4. There is increasing interest in using new sources of data for personalizing the learning experience and for performance measurement (learning analytics).
  5.  The role of educators continues to change due to the vast resources that are accessible to students via the Internet.
  6. Education paradigms are shifting to include online learning, hybrid learning, and collaborative models.

Significant challenges limit the transition to the emerging trends. We seem to be playing catchup a lot these days.

  1. Faculty training still does not acknowledge the fact that digital media literacy continues its rise to importance as a key skill in every discipline and profession.
  2. The emergence of new scholarly forms of authoring, publishing, and researching outpaces sufficient and scalable modes of assessment.
  3. Too often it is education’s own processes and practices that limit broader uptake of new technologies.
  4. The demand for personalized learning is not adequately supported by current technology or practices.
  5. New models of education are bringing unprecedented competition to the traditional models of higher education.
  6. Most academics are not using new technologies for learning and teaching, nor for organizing their own research.

Want to get a head start on knowing what’s coming up for the next Horizon Report?

  • Into Twitter? Use the hashtag #NMCHz to stay in the know and get a steady stream of resources.
  • Mobile? Get the app for HZ News (iOS and Android)
  • Want to keep up with the advisory board’s work during the year? Log onto horizon.wiki.nmc.org
NMC Horizon Report 2013 Higher Education Edition

Evaluating Class Size in Online Education

The American Association of Colleges of Nursing (AACN) sponsored this webinar, facilitated by Dr. Susan Taft of the Kent State University College of Nursing in Kent, OH. Taft co-authored “A Framework for Evaluating Class Size in Online Education” that was published in The Quarterly Review of Distance Education, 12(3), 2011, 181-197.

The bad news … there is NO one size fits all for determining optimal class sizes for online courses. The good news … there ARE guidelines for determining optimal class size.

Optimal class size is defined as healthy revenue generation PLUS desirable student learning outcomes.

Variables (factors) associated with workload in teaching online courses include:

  • faculty experience with distance education
  • the level at which the course is offered – graduate or undergraduate
  • content to be covered and course design
  • size of the class
  • online platform used, and presence or absence of technology support and/or teaching assistants
  • the mode of instruction (e.g. whether strictly web-based or combined with other modes of instruction)

Taft went on to review three educational frameworks that provide guidelines:

  1. The Objectivist-Constructivist Continuum
  2. Bloom’s Taxonomy
  3. The Community of Inquiry Model

Class sizes on the objectivist-constructivist dimension:

  • objectivist – largely one-way communication – can be large class size
  • mix of objectivist-constructivist – medium teaching intensity – medium size 20+
  • constructivist – interactive with higher teaching intensity – < 20 students

Class sizes and Bloom’s Taxonomy dimensions:

  • upper levels of taxonomy – analysis, synthesis, evaluation – small class size < 15
  • middle of taxonomy – application – medium teaching intensity – 16-40 students
  • lower levels of taxonomy – knowledge, comprehension – lower teaching intensity – 30+ students

The Community of Inquiry Model is the most complex of the three. Three types of presence are recommended for online courses:

  1. Teaching presence (faculty)
    1. course design & organization
    2. facilitating discourse – this may or may not be used
    3. direct instruction – may be fully or partially used
  2. Cognitive presence (students) – may or may not be fully required
  3. Social presence (faculty & students) – faculty being a “real person” in the online environment; may or may not be present
  • With the Community of Inquiry model, partial teaching presence that is associated with lower teaching intensity can have a class size of 25+.
  • With full teaching presence, cognitive presence, and social presence, there is higher teaching intensity and smaller class sizes of < 20 students.

Using the objectivist-constructivist continuum + Bloom’s Taxonomy leads to a more objective and quicker determination of class size. When Community Of Inquiry model is considered, the complexity of judging appropriate class size increases.

Examples of class size determinations considering combinations of all 3 frameworks:

  • Use of objectivist teaching methods, lower levels of Bloom’s Taxonomy, limited implementation of COI – class size can be large, > 30 students
  • Constructivist methods, higher levels of Bloom, and full use of COI model – class size should be small < 15 students
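The combinations above amount to a rule-of-thumb mapping from teaching intensity to class size. A toy sketch follows; the category labels and size bands are paraphrased from the talk, but the scoring scheme itself is an invented illustration:

```python
def suggest_class_size(pedagogy, bloom_level, coi_use):
    """Toy heuristic combining the three frameworks from Taft's webinar.
    Higher total 'intensity' suggests a smaller class.
    The point values and cutoffs are illustrative, not from the talk."""
    intensity = (
        {"objectivist": 0, "mixed": 1, "constructivist": 2}[pedagogy]
        + {"lower": 0, "middle": 1, "upper": 2}[bloom_level]
        + {"limited": 0, "partial": 1, "full": 2}[coi_use]
    )
    if intensity <= 1:
        return "large (30+ students)"
    if intensity <= 3:
        return "medium (16-40 students)"
    return "small (< 15 students)"

print(suggest_class_size("objectivist", "lower", "limited"))    # → large (30+ students)
print(suggest_class_size("constructivist", "upper", "full"))    # → small (< 15 students)
```

Any real policy would of course weigh the other workload variables listed earlier (faculty experience, platform, support staff) rather than these three dimensions alone.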

For courses, determine how much the faculty member needs to be present and at the center to help students learn. Much of this determination (80%) can be discerned from the syllabus (as long as it’s a good syllabus). For the additional 20%, review the online workload – are faculty facilitating good, meaty discussions among the students? Are faculty grading online discussions?

A truism – most faculty see their specific course as the highest intensity, requiring the highest workload level. In most cases, this is not true, so the administrator needs to review across all faculty. Develop guidelines specifying how many students will be allowed into the courses offered in a particular semester, and develop guidelines for different levels in the program – for example, RN-BSN courses should have 20-40 students in each section so that faculty can grade papers and give effective feedback.

Synchronous sessions can add an additional level of teaching intensity – faculty need command of the tools to make them work, and synchronous sessions tend to generate questions and issues that faculty must follow up on. Synchronous teaching should be factored into faculty workload.

Discussion of the Quality Matters Program – it is great for structure of a course, but it doesn’t address process and outcomes. The Sloan Consortium quality scorecard is a better model for online work, according to the presenter.

In addition to the presenter’s article (referred to above) some accessible online resources related to determining optimal class size in online education are:

REMINDER: UTHSC is an institutional member of the Sloan Consortium, which enables faculty and staff to obtain important and relevant materials related to online education. Contact Cindy Russell for details.

For follow-up material or discussions related to this or other topics, contact Cindy Russell at crussell@uthsc.edu or 901-448-6158.

UTHSC and Sloan-C Virtual Conference

UTHSC is a virtual attendee for the 18th Annual Sloan Consortium International Conference on Online Learning, running from October 10-12, 2012. The conference will provide the latest information on asynchronous learning programs, processes, packages, and protocols. It’s geared to both experienced professionals and interested newcomers.

Here’s how you can participate:

  • Join campus colleagues at one or more sessions over the 3 days. See below for the schedule and location, including the specific session to be shown (hyperlinked so you can read more about the session).
  • Join colleagues in your colleges who have individual logins – contact Cindy Russell at crussell@uthsc.edu or 901-448-6158 for a list of individuals in your college.
  • Follow the Sloan-C Conference social networking on Facebook, Twitter (#aln12 hashtag) and LinkedIn.
  • Remember that you can follow UTHSC on our social networking sites for updates as several individuals participate in the virtual conference. Find us on Facebook, Twitter, and our blog.

UTHSC is an institutional member of Sloan-C and that comes with loads of benefits. Read about all the benefits of Sloan-C membership. Contact Cindy Russell at crussell@uthsc.edu or 901-448-6158 for instructions on taking advantage of this great benefit.

Wednesday, October 10 in Hyman 101

Thursday, October 11 in Hyman 407

Friday, October 12 in GEB A304

Asynchronous strategies are becoming a more important part of our teaching tools. Attend all or part of this virtual conference to make sure you’re up to date on best practices and know what’s new and useful.