IMAGE: EduSense is a comprehensive classroom sensing system that provides instructors detailed data on their own teaching and student engagement. Credit: Carnegie Mellon University
While training and feedback opportunities abound for K-12 educators, the same can't be said for instructors in higher education. Currently, the most effective mechanism for professional development is for an expert to observe a lecture and provide personalized feedback. But a new system developed by Carnegie Mellon University researchers provides comprehensive, inexpensive and scalable real-time sensing that creates a continuous feedback loop for the instructor.
The system, called EduSense, analyzes a variety of visual and audio features that correlate with effective instruction. "Today, the teacher acts as the sensor in the classroom, but that's not scalable," said Chris Harrison, assistant professor in CMU's Human-Computer Interaction Institute (HCII). Harrison said classroom sizes have ballooned in recent decades, and it's difficult to lecture and be effective in large or auditorium-style classes.
EduSense is minimally obtrusive. It uses two wall-mounted cameras -- one facing students and one facing the instructor. It senses things such as students' posture to determine their engagement, and how much time instructors pause before calling on a student. "These are codified things that educational practitioners have known as best practices for decades," Harrison said.
A single off-the-shelf camera can view everyone in the classroom and automatically identify information such as where students are looking, how often they're raising their hands and if the instructor moves through the space instead of staying behind a podium. The system uses OpenPose, another CMU project, to extract body position. "With advances in computer vision and machine learning, we can now provide insights that would take days if not months to get with manual observation," said Karan Ahuja, a member of the research team who is pursuing his Ph.D. in the HCII.
Harrison said learning scientists are interested in the instructional data. "Because we can track the body, it's like wearing a suit of accelerometers. We know how much you're turning your head and moving your hands. It's like you're wearing a virtual motion-capture system while you're teaching."
Using high-resolution cameras streaming 4K video for many classes at once is a "computational nightmare," Harrison said. To keep up, computing resources are elastically assigned to provide the best possible frame rate for real-time data.
The project also has a strong focus on privacy protection, guided by Yuvraj Agarwal, an associate professor in the university's Institute for Software Research. The team didn't want to identify individual students, and EduSense can't. No names or other identifying information are used, and since camera data is processed in real time, it is quickly discarded.
Now that the team has demonstrated that they can capture the data, HCII faculty member Amy Ogan said their current challenge is wrapping it up and presenting it in a way that's educationally effective. The team will continue working on instructor-facing apps to see if professors can integrate the feedback into practice. "We have been focused on understanding how, when and where to best present feedback based on this data so that it is meaningful and useful to instructors to help them improve their practice," she said.
This research has been presented at UbiComp and the International Conference of the Learning Sciences, and will be presented this coming April at the American Educational Research Association annual meeting.
In this article I will show how you can use the Microsoft Kinect for Xbox 360 on Mac OS X. This lets you create applications that interact with the body's movements.
Introduction
This article is intended for readers with solid experience in information technology, both as developers and as systems engineers, especially on Unix systems. The driver installation can be a little tricky, especially if something does not work the first time.
Fair warning: there are some commands to run from the terminal, and I take no responsibility if these commands (or connecting the Kinect) damage your Mac. However, if you are familiar with the shell and Unix systems, you should have no problems.
The version of the Kinect that I have is sold separately from the Xbox, and it includes the power supply and USB adapter in the package. If you have the version bundled with the latest Xbox (which doesn't include the adapter with the power supply), you will need to buy that adapter separately (it is available on Amazon).
I connected the Kinect to an iMac running OS X 10.7.4 64-bit. However, if you read the whole article you'll be able to adapt the installation process to other systems without much effort.
Now that I have described the tools used for testing, we can install and configure the required software and drivers.
Driver and SDK
Before proceeding you should know that there are several APIs and SDKs available for the Kinect. The two main SDKs are OpenNI and OpenKinect. The first is maintained by PrimeSense, the company that developed the technology behind the Kinect; OpenKinect is instead a community of developers that released the libfreenect library. There is also the official SDK released by Microsoft, but unfortunately it only works on Windows. I will analyze that one in a future article.
For this tutorial I used OpenNI, because that SDK is developed by the company that created the technology behind the Kinect, and it comes with many examples. I used a wrapper for Processing (a programming language based on Java) called Simple OpenNI.
Simple OpenNI is a very good project, but I had to make a number of changes to the installation process to adapt it to my version of Mac OS X. All changes are listed below.
Let's start
First of all, open the page with the official procedure for installing Simple OpenNI on Mac OS X. As mentioned above, I had to slightly change the installation procedure. Here's how:
I downloaded Xcode 4 and installed it.
I downloaded version 2.1.2 of MacPorts (the version shown in the installation guide is too old) and installed it.
I opened a terminal and ran the command sudo port install git-core. If that command returns the following error: Unable to open port: can't read "build.cmd": Failed to locate 'make' in path: '/opt/local/bin:/opt/local/sbin:/bin:/sbin:/usr/bin:/usr/sbin' or at its MacPorts configuration time location, did you move it? then start Xcode, go to Preferences -> Downloads, install the "Command Line Tools", and then try the command sudo port install git-core again.
Again from the terminal, I ran the following command: sudo port install libtool
Finally, this: sudo port install libusb-devel +universal. If that command returns the following error: Please do not install this port since it has been replaced by 'libusb' then you have to run the following commands:
sudo rm -f /opt/local/lib/libusb-1.0.0.dylib
sudo port clean libusb
sudo port install libusb +universal
I downloaded OpenNI_NITE_Installer-OSX, unzipped it, and then ran (inside the unzipped folder) the command sudo ./install.sh. This command installs the drivers needed for the Kinect to work properly.
I downloaded and installed Processing 2.0 for Mac OS X, because version 1.5.x has some problems with the latest versions of Mac OS X. We will have to make a few changes to the source code of the examples provided with Simple OpenNI; I'll show how to do that later.
Now we need to download the SimpleOpenNI library for Processing. I unzipped SimpleOpenNI.zip, which produces a folder called SimpleOpenNI. Copy this folder into the directory /Users/'your username'/Documents/Processing/libraries (if the libraries folder does not exist, create it). The result is shown in the screenshot below.
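The copy step can also be done from the terminal. This is a minimal sketch, assuming the unzipped SimpleOpenNI folder is in the current directory and your Processing sketchbook is in the default Documents location:

```shell
# Create the Processing libraries folder if it does not exist yet,
# then copy the unzipped SimpleOpenNI folder into it.
LIB_DIR="$HOME/Documents/Processing/libraries"
mkdir -p "$LIB_DIR"
if [ -d SimpleOpenNI ]; then
  cp -R SimpleOpenNI "$LIB_DIR/"
  echo "SimpleOpenNI installed in $LIB_DIR"
else
  echo "Unzip SimpleOpenNI.zip into the current directory first."
fi
```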
The SimpleOpenNI folder also contains several examples that show how to use the Kinect, all of course written in Processing.
The Code
Connect the Kinect to the wall socket and to the iMac, open Processing, and load a sample file from /Users/'your username'/Documents/Processing/libraries/SimpleOpenNI/examples/OpenNI. Open the example file User3d.pde from the User3d folder.
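If you prefer the terminal, you can list the bundled sketches before opening one in Processing. A small sketch of that, assuming the default sketchbook location from the previous step:

```shell
# List the OpenNI sample sketches that ship with SimpleOpenNI.
EXAMPLES="$HOME/Documents/Processing/libraries/SimpleOpenNI/examples/OpenNI"
if [ -d "$EXAMPLES" ]; then
  ls "$EXAMPLES"
else
  echo "SimpleOpenNI examples not found at $EXAMPLES"
fi
```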
If we run the code (note: you must stand at least 1.5 meters from the Kinect) we should obtain an image like the one below. You can see a sort of skeleton that follows the movements of my body:
The User3d.pde file (like the others in the examples folder) is well commented; if you read the code and the comments, you can learn how to customize it or create new programs.
You have to make some changes to the source code before trying other sample files. You should know that Processing 2.0, unlike earlier versions, does not automatically import some essential libraries that our sample files need.
If you try, for example, to run the file Hands3d.pde, you will receive the error Cannot find a class or type named "Iterator", as shown in the figure below:
To solve this problem we have to add an import for the Iterator class. I imported java.io.File (needed by other sample files as well) and java.util.Iterator, adding these two lines at the top of the sketch: import java.io.File; and import java.util.Iterator;
If you receive other exceptions, check on the Internet (or in the official Java documentation) which library you have to import.
Another example that I tried (and recommend) is Slider2d. It allows you to play with some squares on the screen using your hands. Below is my test:
As I wrote before, there are many interesting examples in the examples folder of SimpleOpenNI. I recommend trying all of them.
Sometimes Processing returns an error saying the Kinect has been disconnected. In that case you have to disconnect and reconnect the device.
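Before reconnecting, you can check from the terminal whether OS X still sees the device on the USB bus. system_profiler with the SPUSBDataType data type is part of OS X; the grep pattern is only a guess at how the Kinect identifies itself, so adjust it if your device reports a different name:

```shell
# Ask OS X for the USB device tree and look for a Kinect-like entry.
USB_INFO=$(system_profiler SPUSBDataType 2>/dev/null || true)
if echo "$USB_INFO" | grep -qiE "xbox|kinect"; then
  echo "Kinect detected on the USB bus."
else
  echo "Kinect not detected; unplug and reconnect the device."
fi
```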
As already mentioned, the source code is well commented. This will allow you to edit the code according to your needs, and along the way you will learn to create new programs.