
Tuesday, November 5, 2019

EduSense: Like a FitBit for your teaching skills

reposted from

NEWS RELEASE 


CMU researchers develop comprehensive classroom sensing system
CARNEGIE MELLON UNIVERSITY
Image: EduSense is a comprehensive classroom sensing system that provides instructors detailed data on their own teaching and student engagement. (Credit: Carnegie Mellon University)
While training and feedback opportunities abound for K-12 educators, the same can't be said for instructors in higher education. Currently, the most effective mechanism for professional development is for an expert to observe a lecture and provide personalized feedback. But a new system developed by Carnegie Mellon University researchers offers a comprehensive real-time sensing system that is inexpensive and scalable to create a continuous feedback loop for the instructor.
The system, called EduSense, analyzes a variety of visual and audio features that correlate with effective instruction. "Today, the teacher acts as the sensor in the classroom, but that's not scalable," said Chris Harrison, assistant professor in CMU's Human-Computer Interaction Institute (HCII). Harrison said classroom sizes have ballooned in recent decades, and it's difficult to lecture and be effective in large or auditorium-style classes.
EduSense is minimally obtrusive. It uses two wall-mounted cameras -- one facing students and one facing the instructor. It senses things such as students' posture to determine their engagement, and how much time instructors pause before calling on a student. "These are codified things that educational practitioners have known as best practices for decades," Harrison said.
A single off-the-shelf camera can view everyone in the classroom and automatically identify information such as where students are looking, how often they're raising their hands and if the instructor moves through the space instead of staying behind a podium. The system uses OpenPose, another CMU project, to extract body position. "With advances in computer vision and machine learning, we can now provide insights that would take days if not months to get with manual observation," said Karan Ahuja, a member of the research team who is pursuing his Ph.D. in the HCII.
Harrison said learning scientists are interested in the instructional data. "Because we can track the body, it's like wearing a suit of accelerometers. We know how much you're turning your head and moving your hands. It's like you're wearing a virtual motion-capture system while you're teaching."
Using high-resolution cameras streaming 4K video for many classes at once is a "computational nightmare," Harrison said. To keep up, resources are elastically assigned to provide the best possible frame rate for real-time data.
The project also has a strong focus on privacy protection, guided by Yuvraj Agarwal, an associate professor in the university's Institute for Software Research. The team didn't want to identify individual students, and EduSense can't. No names or identifying information is used, and since camera data is processed in real time, it is discarded quickly.
Now that the team has demonstrated that they can capture the data, HCII faculty member Amy Ogan said their current challenge is wrapping it up and presenting it in a way that's educationally effective. The team will continue working on instructor-facing apps to see if professors can integrate the feedback into practice. "We have been focused on understanding how, when and where to best present feedback based on this data so that it is meaningful and useful to instructors to help them improve their practice," she said.
This research has been presented at Ubicomp, the International Conference of the Learning Sciences, and will be presented this coming April at the American Educational Research Association annual meeting.

Thursday, August 8, 2013

Use Kinect with Mac OSX

reposted from


Sunday, 30 December 2012 23:18

Use Kinect with Mac OSX

In this article I will show how you can use the Microsoft Kinect for Xbox 360 on Mac OSX. This lets you create applications that interact with the movements of the body.

Introduction

This article is intended for people with solid experience in information technology, both as developers and as systems engineers, especially on Unix systems. The installation of the drivers can be a little tricky, particularly if something does not work the first time.
Be warned: there are commands to run in the terminal, and I take no responsibility if these commands (or connecting the Kinect) damage your Mac. However, if you are familiar with the shell (and Unix systems) you should not have problems.
The version of the Kinect that I have is sold separately from the Xbox and includes the power supply and USB adapter in the package. If you have the version bundled with the latest Xbox (which does not include the adapter with the power supply), you will need to buy that adapter separately; it is available on Amazon.
I connected the Kinect to an iMac with OSX 10.7.4 64-bit. However, if you read the whole article you'll be able to adapt the installation process on different systems without much effort.
Well, now that I have described the tools used for testing, we can install and configure the software and drivers required.

Driver and SDK

Before proceeding, you should know that several APIs and SDKs are available for the Kinect. The two main SDKs are OpenNI and OpenKinect. The first is maintained by PrimeSense, the company that developed the technology behind the Kinect; the second comes from a community of developers called OpenKinect, which released the libfreenect library.
There is also the official SDK released by Microsoft, but unfortunately it only works on Windows. I will cover it in a future article.
For this tutorial I used OpenNI, because that SDK is developed by the company that created the technology behind the Kinect, and it comes with many examples.
I used Simple OpenNI, a wrapper for Processing (a programming language based on Java).
Simple OpenNI is a very good project, but I had to make a number of changes to the installation process to adapt it to my version of Mac OSX. All the changes are listed below.

Let's start

First of all, open the page with the official procedure for installing Simple OpenNI on Mac OSX. As described above, I had to slightly change the installation procedure. Here's how:
  1. I downloaded Xcode 4 and installed it.
  2. I downloaded version 2.1.2 of MacPorts (the version shown in the installation guide is too old) and installed it.
  3. I downloaded the Java JDK 7 (for Mac OS X x64).
  4. I opened a terminal and ran the command: sudo port install git-core
    If that command returns the following error:
    Unable to open port: can't read "build.cmd": Failed to locate 'make' in path: '/opt/local/bin:/opt/local/sbin:/bin:/sbin:/usr/bin:/usr/sbin' or at its MacPorts configuration time location, did you move it?
    then you have to start Xcode, go to Preferences -> Downloads, install the "Command Line Tools", and then try the command sudo port install git-core again.
  5. Again from the terminal, I ran the following command: sudo port install libtool
  6. Finally, this: sudo port install libusb-devel +universal
    If that command returns the error Please do not install this port since it has been replaced by 'libusb', then run the following commands:
    1. sudo rm -f /opt/local/lib/libusb-1.0.0.dylib
    2. sudo port clean libusb
    3. sudo port install libusb +universal
  7. I downloaded OpenNI_NITE_Installer-OSX, unzipped it, and then ran (inside the unzipped folder) the command sudo ./install.sh
    This command installs the drivers the Kinect needs to work properly.
  8. I downloaded and installed Processing 2.0 for Mac OSX, because version 1.5.x has some problems with the latest versions of Mac OSX. We will have to make a few changes to the source code of the examples provided with Simple OpenNI; I'll show how later.
  9. Now we need to download the SimpleOpenNI library for Processing. I unzipped 'SimpleOpenNI.zip', which produces a folder called SimpleOpenNI. Copy that folder into the directory /Users/'your username'/Documents/Processing/libraries. If that directory does not exist, create it.
The SimpleOpenNI folder also contains several examples that show how to use the Kinect, all written in Processing, of course.

The Code

Connect the Kinect to the wall socket and to the iMac, open Processing, and load a sample file from /Users/'your username'/Documents/Processing/libraries/SimpleOpenNI/examples/OpenNI. Open the example file User3d.pde from the folder User3d.
If we run the code (note: you must stand at least 1.5 meters from the Kinect), we should see a sort of skeleton drawn on screen that follows the movements of the body.
The User3d.pde file (like the other files in the samples folder) is well commented; if you read the code (and the comments), you can learn how to customize it or create new programs.
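To get a feel for the basic structure before digging into the skeleton-tracking code, here is a minimal Processing sketch that just displays the Kinect depth image. It follows the pattern used throughout the bundled SimpleOpenNI examples; exact method names can vary between library versions, so treat this as a sketch rather than a guaranteed API:

```
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);  // connect to the Kinect
  context.enableDepth();             // activate the depth camera
}

void draw() {
  context.update();                  // grab a new frame from the sensor
  image(context.depthImage(), 0, 0); // draw the depth map
}
```

If this runs and shows a grayscale depth image, the drivers and the library are installed correctly.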
You have to make some changes to the source code before trying the other sample files. You should know that Processing 2.0, unlike earlier versions, does not automatically import some essential libraries that our sample files need.
If you try, for example, to execute the file Hands3d.pde, you will receive the error: Cannot find a class or type named "Iterator".
To solve this problem we have to add an import for the Iterator class. I imported the libraries java.io.File (which I also need for other sample files) and java.util.Iterator at the top of the sketch.
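The two lines in question go at the very top of the sketch (these are standard Java packages, so nothing here is version-specific):

```
// Added at the top of Hands3d.pde:
import java.io.File;       // also needed by other sample files
import java.util.Iterator; // fixes: Cannot find a class or type named "Iterator"
```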
If you receive other exceptions, check on the Internet (or in the official Java documentation) which library you have to import.
Another example that I tried (and want to recommend to you) is Slider2d: it lets you play with some squares on the screen using your hands.
As I wrote before, there are many interesting examples in the examples folder of Simple OpenNI. I recommend trying all of them.
Sometimes Processing returns an error about the Kinect being disconnected. In that case, disconnect and reconnect the device.
As already mentioned, the source code is well commented. This will allow you to adapt the code to your needs, and you will learn to create new programs.
That's all. Have fun!!!