Motion Laboratory

Analysis of Rotation Movement in Ballet using Motion Data from Different Dates

In physical training such as sports, we learn basic poses and basic movements through continuous practice. However, proficiency in a motion is difficult to judge visually. The goal of this research is to clarify the elements required to master ballet skills and to support people in learning them. In this study, ballet skills are quantitatively analyzed using motion data of ballet obtained on different dates. The target is the "Grand Fouette" rotation movement, which has a high level of difficulty and often appears in ballet pieces. First, the features of the "Grand Fouette" are analyzed based on ballet theory. Next, eleven feature values are defined, such as the rate at which the face points to the front during rotation and the knee angle at the time of the plié. Then, the feature values are calculated from motion data captured on four dates, each half a year to a year apart, and compared. As a result, it was confirmed that there are characteristic differences in the trajectory of the toe of the supporting leg and in the rate at which the face points to the front during rotation.
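
A minimal sketch in Python of how two of these feature values could be computed from 3D joint positions; the joint layout, the stage-front direction, and the tolerance angle are illustrative assumptions, not the study's actual definitions.

    import numpy as np

    def knee_angle(hip, knee, ankle):
        """Knee angle in degrees from three 3D joint positions (e.g. at the plie)."""
        thigh, shank = hip - knee, ankle - knee
        cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    def front_facing_rate(face_dirs, front=np.array([0.0, 0.0, 1.0]), tol_deg=30.0):
        """Rate of frames during a rotation in which the face points toward the
        (assumed) stage-front direction, within a tolerance of tol_deg degrees.
        face_dirs: (n_frames, 3) unit vectors of the face direction."""
        facing = (face_dirs @ front) >= np.cos(np.radians(tol_deg))
        return facing.mean()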

Distinguishing of Leg Motions in Noh by Machine Learning using Acceleration Data

The goal of this research is to produce live stage performances featuring interactions between actors and CG images projected on the stage. In this research, I attempt to distinguish the leg motions of Noh by machine learning using acceleration data. The acceleration data of a sliding foot and a stamping foot are used as training data; the sliding foot is represented by the X-axis and the stamping foot by the Y-axis. Verification data are then inputted to classify each motion as a sliding foot, a stamping foot, or another motion. Seven acceleration files were used for the verification. As a result, two of the four sliding-foot motions, both of the two stamping-foot motions, and three of the five other motions were correctly distinguished. This approach was thus able to distinguish motions that the previous approach using threshold values could not.
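
A minimal sketch in Python of one way such a classifier could be trained from windowed acceleration data; the window length, the statistical features, and the SVM classifier are assumptions for illustration, not necessarily the method used in the study.

    import numpy as np
    from sklearn.svm import SVC

    def window_features(acc, win=100):
        """Split a (n_samples, 3) acceleration series into fixed-size windows
        and compute simple per-axis statistics as features."""
        feats = []
        for start in range(0, len(acc) - win + 1, win):
            w = acc[start:start + win]
            feats.append(np.hstack([w.mean(axis=0), w.std(axis=0), np.abs(w).max(axis=0)]))
        return np.array(feats)

    def train_classifier(segments, labels):
        """segments: list of (n_samples, 3) acceleration arrays, one per labeled
        motion; labels: 0 = sliding foot, 1 = stamping foot, 2 = other."""
        X, y = [], []
        for acc, label in zip(segments, labels):
            f = window_features(acc)
            X.append(f)
            y.extend([label] * len(f))
        return SVC(kernel="rbf").fit(np.vstack(X), y)

    # Verification: clf.predict(window_features(test_acc)) yields one predicted
    # class per window of a test acceleration file.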

Idol Dance Theatre: An Interactive Dance Motion Viewer with a Virtual Idol on the Web

The purpose of this research is to develop educational and engaging applications using motion data. I have developed an entertainment system for interactively viewing and learning dance motions through 3D animation on the Web. This system can play 3DCG animations of ballet, hip-hop, and Japanese Noh dances with a character embodied as a virtual idol. I created the CG character using Metasequoia and developed the GUI using WireFusion. The system is based on a Java applet, so it can be easily operated on the Web using Web3D technology. The interactive functions of the system include costume changes and playback control. Evaluations by students showed that the system incorporating a virtual idol was effective for viewing dances.

A Retrieval System for Ballet Steps using Body Translation Information

The purpose of this research is to construct a database for managing three-dimensional motion data. In this research, a motion database for the basic motions of ballet was constructed, and a system to search for motions using three-dimensional position information was developed. In this system, a motion can be selected with a GUI and then displayed in a motion list. The system can search for motions using the direction of body translation and the maximum distance from the start point, both of which are computed beforehand. Furthermore, the selected motion can be previewed as a 3D animation. I conducted an experiment with 12 students who had no ballet experience in order to evaluate the usefulness of the system. As a result, the subjects found the system easy to operate, and the retrieval method using body translation information turned out to be useful even for inexperienced users.
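
A minimal sketch in Python of how the two retrieval keys could be precomputed from a root-joint trajectory and then used for searching; the axis convention (x-z as the horizontal plane) and the matching tolerance are assumptions for illustration.

    import numpy as np

    def translation_features(root_traj):
        """root_traj: (n_frames, 3) positions of the body's root joint.
        Returns the horizontal direction of body translation (degrees) and the
        maximum distance from the start point."""
        start = root_traj[0]
        disp = root_traj[-1] - start                        # net body translation
        heading = np.degrees(np.arctan2(disp[0], disp[2]))  # x-z plane assumed horizontal
        max_dist = np.linalg.norm(root_traj - start, axis=1).max()
        return heading, max_dist

    def search(db, wanted_heading, tol_deg=45.0, min_dist=0.0):
        """db: list of (motion_name, heading, max_dist) entries computed beforehand."""
        return [name for name, h, d in db
                if abs((h - wanted_heading + 180.0) % 360.0 - 180.0) <= tol_deg and d >= min_dist]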

A Simulation System for Stage Performances using TVML

I have developed a simulation system for stage performances that can interactively control several types of media, such as movies, music, and CG animation. The purpose of this research is to support the creation of stage performances and to archive digital content. The system can simultaneously control videos, 3D animation, music, and subtitles using a TVML Player. It has interactive functions that switch the video screens, play back music, control CG character animations, and change the camera view using a keyboard. I reproduced the scenes of a stage performance of "A Spider's Thread" using the system. In addition, I created digital content incorporating a subtitle script as well as a CG character with Noh motion. An assessment experiment verified that the combination of movie and CG animation helped viewers easily understand the story.
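
A minimal sketch in Python of the kind of keyboard-to-action dispatch described above; the key bindings and actions are hypothetical placeholders, and the actual playback is delegated to the TVML Player and the media players.

    ACTIONS = {
        "v": "switch video screen",
        "m": "play back music",
        "c": "start CG character animation",
        "1": "change to camera view 1",
        "2": "change to camera view 2",
    }

    def on_key(key):
        """Dispatch a key press to the corresponding stage-control action."""
        action = ACTIONS.get(key)
        if action is not None:
            print(action)  # placeholder for sending the real media/TVML command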

Comparison of Dance Motions based on Principal Component Analysis for Body-part Angles

The purpose of this research is to compare classical ballet, contemporary dance, and Japanese Noh by using motion-capture data. The motion data contain position information for 25 points. Principal component analysis was conducted on the angles of body parts. First, the angles of the head, bust, upper body, both elbows, shoulders, knees, ankles, and feet were calculated as thirteen feature values by using the inner product of body-part vectors. Then, principal component analysis was applied to the thirteen feature values, and Negentropy was computed from the resulting principal component scores. Distribution charts of the principal component scores with large differences in Negentropy between the dances were created. As a result, the angles of the left knee and right shoulder in ballet, the angle of the head in contemporary dance, and the angles of both ankles in Japanese Noh were found to be distinguishing features.
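
A minimal sketch in Python of this pipeline: a body-part angle from the inner product of two body-part vectors, PCA over the thirteen feature values, and a negentropy approximation for the resulting scores; the log-cosh approximation is a standard substitute and may differ from the computation used in the study.

    import numpy as np
    from sklearn.decomposition import PCA

    def body_part_angle(a, b, c):
        """Angle at joint b (degrees) via the inner product of vectors b->a and b->c."""
        u, v = a - b, c - b
        cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

    def pca_scores(angles, n_components=3):
        """angles: (n_frames, 13) matrix of the thirteen feature values."""
        return PCA(n_components=n_components).fit_transform(angles)

    def negentropy(y):
        """Approximate negentropy J(y) ~ (E[G(y)] - E[G(nu)])^2 with
        G(u) = log cosh(u) and nu a standard normal variable (E[G(nu)] ~ 0.3746)."""
        y = (y - y.mean()) / y.std()
        return (np.mean(np.log(np.cosh(y))) - 0.3746) ** 2

    # scores = pca_scores(angles)
    # neg = [negentropy(scores[:, k]) for k in range(scores.shape[1])]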

A Support System for Choreography Creation by Body-part Motion Synthesis using Motion Data

The goal of this research is to develop a support system for choreography creation using human body motions obtained with motion capture systems. I have developed a system for simulating choreography by synthesizing body-part motions. The system's target users are dancers and choreographers, and it is assumed that the created choreographies can be demonstrated by actual dancers. A touch panel is used as the user interface so that the system can be operated easily and intuitively. The motion data are obtained from a professional dancer and then segmented into seven body parts. The user can simulate the work of choreography by motion synthesis with unrestricted timing and by choreography editing. In the motion synthesis, the user simulates the synthesized motions by selecting various body-part motions and composing them into whole-body motions; the synthesis is done by replacing parts of the base motion with the selected body-part motions (see the sketch below). The composed short choreography is displayed as CG animation in real time. In the choreography editing, the user saves the motions created by motion synthesis and then composes a choreography by arranging the saved motions on a timeline. A dance-creation experiment for contemporary dance was conducted to evaluate the utility of the system. Eight dancers with experience in choreography creation evaluated the system: they used it to compose choreographies and then created and demonstrated a short dance while adding nuanced expression to the composed choreography. After the experiment, I received several comments: "This system can enhance dance creation," "The details of the body motion can be checked," and "A body part that is not usually used can also be trained." The experiment confirmed that the system was useful for choreography creation, understanding motions, and dance training.

  • Yoshiyuki Kohno, Asako Soga, Masahito Shiba, A System for Motion Synthesis of Human Body Parts using a Touch Panel, Proc. of the 9th ACM SIGGRAPH International Conference on Virtual Reality Continuum and its Applications in Industry, pp.145-146 (Seoul, Korea), Dec. 2010
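
A minimal sketch in Python of the body-part replacement at the core of the motion synthesis; the data layout (per-joint frame arrays) and the joint names in the seven-part segmentation are illustrative assumptions.

    # A motion is assumed to be a dict mapping joint names to (n_frames, 3)
    # arrays of joint channels; BODY_PARTS maps each of the seven parts to its
    # joints (the names here are illustrative).
    BODY_PARTS = {
        "right_arm": ["RightShoulder", "RightElbow", "RightWrist"],
        "left_arm": ["LeftShoulder", "LeftElbow", "LeftWrist"],
        # ... remaining five parts
    }

    def synthesize(base, part_motion, part):
        """Replace the base motion's channels for one body part with the
        corresponding channels from another captured motion."""
        out = {joint: frames.copy() for joint, frames in base.items()}
        n = min(len(next(iter(base.values()))), len(next(iter(part_motion.values()))))
        for joint in BODY_PARTS[part]:
            out[joint][:n] = part_motion[joint][:n]
        return out
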
A Study of CG Expression with Motion Data for Stage Production and Content Creation

The purpose of this research is to study the use of three-dimensional motion data for the arts. The contributions of this study include the expression of CG animations using motion data for the stage production of Noh-style performances and for CG content creation. Specifically, these contributions involve the creation of CG characters, the creation of pre-rendered CG animations, and motion data conversion for real-time CG rendering. Motion data were captured from a professional Noh actor's performance. In the enhanced stage production based on Noh-Kyogen, CG animations are projected onto two screens on the stage. Scenes featuring backgrounds, many virtual actors performing based on Noh motion data, and flashbacks portrayed by CG animations help to achieve scene productions that are difficult to express solely through real actors' performances. In addition, motions that are difficult to obtain by motion capture are created by editing Noh motions. The results of a questionnaire completed by spectators after the performance confirmed that using the visual effects of CG animation in stage production is effective, that the collaboration between traditional art and CG is useful for understanding the contents, and that it is possible to expand the audience demographics of traditional arts. A simulation technique that can generate a different CG animation scene each time was also proposed: "attractive motions" are defined based on the motion data, and CG characters performing these motions are automatically allocated suitable positions, orientations, and timings. Finally, a CG animation that automatically opens an Emaki (painted picture scroll) and animates its painted pictures is expressed by moving the viewpoint from left to right in the manner of actually viewing an Emaki, by decorating the Emaki, and by automatically allocating CG characters and CG objects.
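
A minimal sketch in Python of how CG characters with the defined "attractive motions" could be automatically allocated positions, orientations, and start times so that a different scene is generated on each run; the stage bounds and parameter ranges are hypothetical.

    import random

    def allocate_characters(attractive_motions, n_characters,
                            stage=((-5.0, 5.0), (0.0, 8.0)), max_delay=10.0):
        """Assign each CG character a motion, a stage position (x, z), an
        orientation, and a start time for the generated scene."""
        (xmin, xmax), (zmin, zmax) = stage
        scene = []
        for _ in range(n_characters):
            scene.append({
                "motion": random.choice(attractive_motions),
                "position": (random.uniform(xmin, xmax), 0.0, random.uniform(zmin, zmax)),
                "orientation_deg": random.uniform(0.0, 360.0),
                "start_time": random.uniform(0.0, max_delay),
            })
        return scene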
