This project concerns interfaces for controlling a MIDI synthesiser. It has real-world relevance in the control of visual effects: the methods, practices and technologies used to create and manipulate elements within moving images in order to elicit a desired emotional response. Visual effects often involve integrating Computer Generated Imagery (CGI) with live-action footage to create realistic environments that would be dangerous, costly or simply inaccessible to produce in real life. Such effects are generally controlled through standard input devices such as mice, joysticks, trackballs, light pens and keyboards; however, alternative virtual interfaces have also been developed, most notably Data Gloves. These interfaces are used in live concerts, DJ performances, medical applications, robotics, biomechanics, 3D virtual design, and as a communication tool for the deaf and speech-impaired community, to make life easier. With only a few hand motions a performer can interact with, control and compose music in a far more improvised way, reducing the interface bottleneck between artists and their music. The system also supports multiple artists controlling the audio simultaneously. The interface prototype is built on standard virtual reality software and user interface technology: Data Gloves are used to manipulate audio objects, and stereoscopic projection displays the virtual 3D sound stage. The aim of this project is to research these virtual reality interfaces and use them to control synthesisers. In short, to make things easier and improve the latest technologies, we investigate how such interfaces and synthesisers work, and how their real-life applications can be extended by overcoming their current difficulties and drawbacks.
The aim of this project is to research an existing virtual reality interface, the P5 glove (a type of Data Glove), and its applications in real life. A Data Glove can be defined as an inventive, glove-like peripheral device, based on patented bend-sensor and remote tracking technologies, that gives users fully intuitive interaction with 3D and virtual environments such as games, websites and educational software. Technologies of this kind are going to make our lives easier. In this project we worked with the P5 glove and GlovePIE programs, and finally controlled MIDI music and audio files using the Data Glove. In earlier days musicians relied entirely on human capability: they struggled to compose music and manage live concerts, because there was no effective means of making their performances realistic and impressive, and it is simply too difficult to play many instruments at once. To overcome this they began using computers for multitasking and to save time, but coordinating and synchronising several computers at once often proved just as difficult. These complications led them to virtual reality interfaces, which create imaginative interactive environments that appear to occupy three-dimensional real space and achieve aesthetic effects. In this project we have addressed the problems they faced and explored improved technology to obtain better output, robustness and versatility. The goal of the project is to control MIDI, and both virtual and real-world applications, according to the user; the results are of benefit to musicians, animators, doctors, scientists and many more.
The objective is to study the implementation of Data Gloves in fields such as music, medicine, animation and education, and, once these are well understood, to continue exploring further, because this is an ongoing music and performance-art project that is not limited to any particular setting. A project on novel interfaces of this kind can certainly help the academic field, since synchronising machines is an ongoing issue everywhere; it can also ease teaching, because by producing dynamic 3D images and environments we can help students understand a subject well, and practical knowledge is at least as important as theoretical knowledge. I believe this is the right time to work on such a project: we have already started enjoying the benefits of this device, and this is an opportunity to overcome its drawbacks.
The objectives of the project are to:
* Study the features of the glove
* Produce outputs from its movements
* Identify which features can be controlled
* Identify and configure all the drivers required by the software
* Assess its limitations and advantages
To deliver the objectives of the project we need a Data Glove to research with, a MIDI device, MIDI software, Visual Jockey software and GlovePIE programs. P5 MIDI allows synthesisers and other MIDI programs to be controlled through hand movement by turning the P5 glove from Essential Reality into a MIDI controller. P5 MIDI not only translates the information coming from the P5 glove sensors into MIDI messages, but also lets us choose the MIDI port and the MIDI messages. The P5 glove can therefore act as an interface between the computer and a MIDI device or a game console. It fits easily over the hand and senses its movements in three dimensions, capturing finger bends and relative hand position to enable intuitive interaction with three-dimensional environments. It offers six degrees of tracking (X, Y, Z, yaw, pitch and roll) using optical tracking technology, bend sensors, and an infrared control receptor with a scratch-resistant, anti-reflective lens, providing true-to-life mobility. MIDI is an acronym for "Musical Instrument Digital Interface", a music-industry-standard communications protocol that lets MIDI instruments and sequencers talk to each other to play and record music.
In this project we worked on how the Data Glove can be used through its various programs and how MIDI files can be manipulated with it; we are not concerned here with how it would work in other real-time applications such as medicine or virtual reality. We concentrated on how the movements of the Data Glove can be varied using different factors, but we have not tested how it would perform in the real-time musical field, as it has not yet been trialled in front of an audience.
To show how the Data Glove can control MIDI software:
In real life, controlling a video on a personal computer through software is normally done with the mouse; using a Data Glove instead gives the performer a much greater visual impact. The synthesiser used here is a MIDI keyboard, which was designed for music input. In particular, a MIDI keyboard is a very good tool for controlling a large number of instruments in a real-time animation system. In this project we faced compatibility issues while testing the Data Glove's functionality, but we overcame them using alternatives and achieved the outcome on time.
Computers have recently made it possible to manipulate and operate ever larger amounts of information, yet humans are cognitively ill-suited to understanding the resulting complexity. All the information is readily available, but users fail to access individual items or to maintain a global context of how the information fits together efficiently. Recent studies in virtual reality using Data Glove technology suggest that encoding subsets of the information with multimedia techniques and placing the resulting visualisations into a perceptual three-dimensional space increases the amount of information that people can meaningfully manage. The Data Glove plays a very important role in recognising hand gestures (a complicated task, as gestures are temporal sequences of hand configurations) and in three-dimensional animation. Data Glove interaction improves the flexibility, usability and re-usability of 3D environment applications because:
* It can be easily encapsulated into a variety of applications.
* It can be used for both two-dimensional and three-dimensional work, even though the gesture methods are likely to differ somewhat between the two areas.
* It allows easy navigation.
* It makes the techniques easily available to a wide variety of users: adults, occasional users, professionals, naive users and children.
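As a simple illustration of the gesture-recognition idea mentioned above, the following Python sketch classifies a static hand posture from per-finger bend values. The thresholds and function names are illustrative assumptions, not part of the P5 SDK:

```python
# Classify a static hand posture from five finger-bend readings.
# Bend values are assumed normalised to 0.0 (flat) .. 1.0 (fully bent);
# the 0.6 / 0.2 thresholds are illustrative, not taken from the P5 SDK.

def classify_posture(bends):
    """bends: [thumb, index, middle, ring, little], each in 0.0-1.0."""
    if all(b > 0.6 for b in bends):
        return "fist"
    if all(b < 0.2 for b in bends):
        return "flat"
    if bends[1] < 0.2 and all(b > 0.6 for b in bends[2:]):
        return "point"  # index extended, remaining fingers curled
    return "unknown"
```

A real recogniser would also consider the temporal sequence of postures, but even this static mapping is enough to bind a few distinct commands to the glove.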
Application areas of the Data Gloves are:
* Virtual reality applications
* Planning systems
* Computer-supported teaching and self-learning.
* Music applications.
* CAD architecture and design.
* Test and simulation systems.
* Scientific modelling
MIDI is a hardware specification and standardised control language that makes it possible for electronic instruments, processors, controllers and other device types to communicate control- and performance-related data in real time. It helps aspiring artists, musicians and composers as well as working professionals, because a MIDI workstation can also act as a portable all-in-one keyboard instrument that includes a polyphonic synthesiser, built-in sequencers, an integrated keyboard, percussion sounds and audio recording capabilities in a single hardware package. MIDI has become an indispensable live-performance tool for many musicians because of its ability to sequence background parts and rhythms in advance, chain them together into a single controllable sequence, and play them on stage. MIDI also provides control over interactive loops, pre-programmed sequences, video playback and on-stage visuals, adding a varied and fresh feel to the performance both for those on stage and for the audience. Beyond on-stage music performance, pre-produced sequencing and lighting, MIDI can play a strong role in the execution and production of on-stage lighting and special effects. MIDI enables drum machines, samplers, sequencers, electronic drums, synthesisers, digital reverbs and delays, home computers, guitars and all sorts of other music-related gear to be interconnected, so that several pieces of equipment can be controlled and played from a central device. MIDI also provides a common timing source for synchronising drum machines and sequencers.
Alongside its benefits, the Data Glove has some perceived problems in use. Its movements are limited because it is connected to the receptor through a wire, and it sometimes exhibits delays in catching up with movement. Fingers can produce sporadic readings, with bad effects if the glove fits the fingers badly. These are the common problems we would face with its use in any other area or real-time application.
SCOPE AND LIMITATIONS
The main problems with the Data Glove concern reliability, from both a physical and a calibration point of view. The glove becomes highly frustrating and unintuitive to use if it fits the fingers badly; fingers can exhibit sporadic movement or even take on physically impossible shapes. The movements of the controller are limited because they depend on the receptor picking up the location of the sensors: when the sensors cannot be detected by the receptor, the movements of the Data Glove are not registered. And since the Data Glove is connected to the receptor through a cable, the user's range of movement is restricted. Data Glove virtual controllers sometimes show a slight delay in movement response, depending also on the speed of the computer. The absence of a left-hand model, and a tired arm after prolonged use, are further limitations of the Data Glove.
The scope of this project is to work only on the functionality of the Data Glove in system control and MIDI management, overcoming the drawbacks above; we are not concerned with how the Data Glove might be used in other real-world applications. There are no participants in this project, since ultimately we deal only with extending the Data Glove's applicability to controlling MIDI files. Project quality was maintained by my testing the results repeatedly in the virtual environment before presenting them to the supervisor. In short, the project is about the movements of the Data Glove in terms of MIDI.
In this project we controlled computer operations using glove movements and hand gestures. When the Data Glove is moved within range of the receptor tower, which has two infrared sensors, the tower picks up the hand gestures. The two infrared sensors detect the visible LEDs on the glove (eight altogether) and convert them into an (x, y, z) position for the glove and an orientation in terms of pitch, yaw and roll. The glove uses a 6-bit A/D converter, giving a resolution of 64 intermediate positions between a fisted and a flat hand. The glove is plugged into the tower, which is in turn connected to the PC's USB port. The glove also has bend sensors in its fingers and four buttons on the top. The P5 is an impressive piece of hardware; a diagram of the Data Glove is shown below.
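The 6-bit resolution mentioned above can be sketched in Python: a raw sensor count in the range 0-63 (64 discrete steps between a flat and a fisted hand) is mapped to a bend fraction. The function name and clamping behaviour are illustrative, not part of the P5 driver:

```python
RAW_MAX = 63  # 6-bit A/D converter: 64 discrete positions (0-63)

def raw_to_bend(raw):
    """Map a raw 6-bit bend reading to a fraction: 0.0 (flat) .. 1.0 (fist)."""
    raw = max(0, min(RAW_MAX, raw))  # clamp any out-of-range reading
    return raw / RAW_MAX
```

With only 64 steps per finger, small tremors map to the same value, which is one reason the glove's finger data feels coarse compared with its positional tracking.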
Pitch is rotation about the x-axis; yaw acts around the y-axis; and roll acts around the z-axis. A positive pitch rotates the hand upward, a positive yaw turns it to the right, and a positive roll turns the top of the hand to face right. The finger-bend data depends on the glove's calibration settings (the P5 is calibrated via its Windows control panel, which comes as part of its installation software). An interactive glove is made from a lightweight material into which transducers are sewn to measure finger-joint angles. These transducers can be fibre optics or strain gauges, which change their physical characteristics when stretched. Gloves are mainly designed for use in virtual environments: the glove monitors the articulation of the fingers, while an extra tracker on the wrist keeps track of the position and orientation of the hand. Together they enable a complete virtual hand to be animated within a virtual environment. There are four buttons on the top face of the glove, labelled A, B, C and D; when the D button is pressed, the glove switches off automatically.
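The pitch/yaw/roll convention above can be made concrete with a small Python sketch that builds the corresponding rotation matrix. The composition order chosen here is one common convention and is an assumption; the P5 driver's own convention may differ:

```python
import math

def rot(pitch, yaw, roll):
    """Rotation matrix for pitch about x, yaw about y, roll about z (radians).
    Composition order R = Rz(roll) * Ry(yaw) * Rx(pitch) is an assumption."""
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    cr, sr = math.cos(roll), math.sin(roll)
    rx = [[1, 0, 0], [0, cp, -sp], [0, sp, cp]]   # pitch about x
    ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]   # yaw about y
    rz = [[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]]   # roll about z
    def mul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return mul(rz, mul(ry, rx))
```

Applying the matrix to a point on the virtual hand gives its rotated position, which is how the orientation data drives the on-screen hand model.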
A wired glove is a glove-like input device for virtual reality environments. The Power Glove was first developed by "Mattel Intellivision" for entertainment applications. The P5 is a common hand-measurement device based on infrared remote tracking and proprietary bend sensors, used to interact with 3D and virtual environments such as educational software, video games, websites and more. It is a USB peripheral that captures finger movements using an optical system with infrared signals rather than sound waves. Various sensor technologies capture physical movements such as the bending angles of the joints of the thumb and of the lower and middle knuckles of the other fingers, and can be extended to measure abduction angles between the fingers. These movements are translated by the software accompanying the glove, since a single movement can mean any number of things. Such gloves can also serve as output devices by providing haptic feedback, a simulation of the sense of touch. A motion tracker, such as a magnetic or inertial tracking device, is often attached to capture the global position and rotation of the glove. The glove offers six degrees of tracking (x, y, z, yaw, pitch and roll) and is designed to be compatible with both the Microsoft Windows and Apple Macintosh operating systems. The Data Glove was originally developed as a gesture-recognition tool based on fibre-optic technology. Many types of glove, such as the Digital Data Entry Glove, MIT LED Glove, Super Glove, Fifth Dimension Technologies 5th Glove and Sensor Glove, have since been developed for real-time computer graphics, animation, gesture recognition, design research and robot control applications.
We found several IEEE papers on the internet in which the Data Glove is applied in various fields, and have considered the few most closely related to this project. In the paper "Techniques for selecting and manipulating objects in the virtual environment" by Yingzhen Liu and Gang Wan, the authors set out to show that interacting with virtual objects in a virtual environment using Data Gloves is more natural, realistic and efficient than using a mouse, and increases the user's immersion. They worked on human-computer interaction by measuring finger motions, finger flexure and abduction between the fingers, in order to recognise gestures correctly from the raw data collected by the computer. They first tracked all the data from the glove, then constructed a frame by converting the sensor data into gestures and comparing these with the user's real hand gestures. As the virtual gestures varied between users, they set particular boundaries for the display. Using these strategies they made selection and modification of virtual objects by the virtual hand, i.e. the human Data Glove, realistic. Further studies on this research, based on various properties, are ongoing.
The second paper, accessed on 27th November from the website "http://homepages.inf.edu.ac.uk", is "Motion Editing With Data Glove" by Wai-Chun Lam, Feng Zou and Taku Komura. In this paper they propose a new method of editing captured human motion data using the Data Glove, applicable not only to editing human motion but also to controlling human figures in real-time environments such as games and virtual reality systems. Wearing the Data Glove, they generate a mapping function that converts the motion of the hand into that of the whole body, produce a new motion out of the existing motion-capture data in a database, and apply the motion to characters of different body sizes using hierarchical retargeting methods. The paper introduces a new dynamic editing method, driven by the Data Glove, to bridge the gap between the nature of editing methods and that of human motion. The user wears the Data Glove, watches the graphical display and mimics human motions, generating motion synchronous with the human gait appearing on the screen by moving the index and middle fingers. The approach has two stages, a capturing stage and a reproduction stage. By capturing human motions and reproducing them in a creative way, the authors succeeded in proposing a new method of editing human motion with the Data Glove.
The third paper we considered is "Data Glove Calibration with Constructed Grasping Gesture Database" by Bin Wang and Shuling Dai. Its main aim is to improve the precision of human-hand Data Glove motion measurement and to construct a human hand model suitable for general-purpose instrumented-glove applications. The paper contributes a grasping-gesture database constructed without the need for any external sensors, and a reliable, discreet calibration routine that can handle cross-coupling errors between the sensors. The authors present a model-based gesture-construction technique for establishing the calibration database, together with a calibration routine that precisely and quickly adjusts the glove to fit a particular user. After experiments using kinematics, they succeeded in proposing a complete Data Glove calibration method that identifies hand impressions precisely without any help from external sensors and handles the cross-coupling errors through the calibration routine.
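The general idea of per-user calibration (though not the specific model-based method of the paper above) can be sketched in Python: record each finger's raw reading for a flat hand and for a fist, then rescale subsequent readings into that user's range. All names and values here are illustrative:

```python
class FingerCalibration:
    """Per-finger min/max calibration: record the raw reading for a flat
    hand (flat_raw) and a fist (fist_raw), then rescale live readings."""

    def __init__(self, flat_raw, fist_raw):
        self.flat_raw = flat_raw
        self.fist_raw = fist_raw

    def bend(self, raw):
        """Return the calibrated bend: 0.0 (flat) .. 1.0 (fist)."""
        span = self.fist_raw - self.flat_raw
        if span == 0:
            return 0.0  # degenerate calibration: the sensor never moved
        frac = (raw - self.flat_raw) / span
        return max(0.0, min(1.0, frac))  # clamp outside the recorded range
```

This simple min/max approach ignores the cross-coupling between sensors that the paper addresses, but it captures why calibration matters: different hands produce very different raw ranges from the same sensors.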
In this project we worked with a dynamic input device, the Data Glove, to construct an interactive 3D virtual design in software, since standard input devices bear little resemblance to natural hand motions. This involves recognising hand gestures and implementing responses to them. In this way we created virtual environments that respond to the music in a dynamic, lively and sprightly way, so that musicians, DJs and others can use them to make the audience feel as if they are in a real environment, and we succeeded in controlling MIDI files in the virtual environment. Transmitting software is used here to track the motions of different parts of the body and change the virtual environment accordingly.
To summarise what we have done in this project: we used the Data Glove as an input device for the computer, varying its attributes in various ways, and controlled the output of MIDI files in the virtual environment, taking the MIDI files as input through MIDI devices that act as a bridge between the GlovePIE software and the MIDI files.
The equipment required for this research is a Data Glove, provided by the university, and a few pieces of software that are available online for free. As the work was not demonstrated before an audience, no participants were needed in this project.
To research and produce the results we followed a particular protocol, a methodology which helped us achieve the expected error-free results on time. The analysis phase helped us reach the target easily: in it we designed what to do and how to do it, then implemented each step in turn by experimenting with the Data Glove and running various GlovePIE programs with modified attributes. We then started testing MIDI files using the Data Glove in the SynthEdit software, a freeware Windows application that uses a modular visual programming language to create music synthesisers and effects units. It provides a GUI (Graphical User Interface) editing system with a full MIDI interface for hardware controllers, and allows users to create VST (Virtual Studio Technology) effects and instruments. All the audio and MIDI plug-ins for SynthEdit are coded in C and C++ using the SynthEdit Music Plug-In Standard application programming interface, which is based on the Generalized Music Plug-In Interface. In order to test how we control MIDI files using the Data Glove, we first have to install the MIDI driver, which acts as a connecting link between the Data Glove and the MIDI software. We installed MIDI Yoke as the MIDI driver; it is a MIDI patch-cable driver used to connect any application's output to any other application's input. After that we installed the P5 MIDI software, which turns the P5 glove from Essential Reality into a MIDI controller: it takes the hand movements coming from the P5 glove sensors and converts them into MIDI messages. This is used to control all the MIDI programs and the MIDI synthesiser with single hand movements. It can be used either with a MIDI synthesiser or with a MIDI device, and it allows the MIDI port to which the messages are sent to be selected, connecting the synthesiser directly to the MIDI port.
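Conceptually, what P5 MIDI does is map continuous glove readings onto MIDI message values. A minimal Python sketch of such a mapping follows; the input range, controller number and channel are illustrative assumptions, not the actual P5 MIDI settings:

```python
def glove_to_cc(value, lo, hi, controller=1, channel=0):
    """Map a glove reading in [lo, hi] to a 3-byte MIDI Control Change.
    Controller 1 (mod wheel) and channel 0 are illustrative defaults."""
    frac = (value - lo) / (hi - lo)
    frac = max(0.0, min(1.0, frac))  # clamp out-of-range readings
    cc_value = round(frac * 127)     # MIDI data bytes are 7-bit (0-127)
    status = 0xB0 | (channel & 0x0F) # 0xB0-0xBF = Control Change status
    return bytes([status, controller, cc_value])
```

For example, the glove's z position over its working range could be mapped to the modulation wheel, so pushing the hand forward sweeps the controller from 0 to 127.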
MIDI is really a wired protocol specification describing the transmission of data from one MIDI-enabled device to another. It defines a set of messages that travel over dedicated, synchronous serial channels. There are two sorts of messages: MIDI short messages and system messages. Short messages are made up of one to three MIDI words, where each MIDI word consists of a start bit, data bits and a stop bit; they carry information such as note beginning and end, volume, and other kinds of musical gesture information. System messages break down into system-exclusive messages, which can be any length and are used to configure and manage MIDI equipment, and active-sensing messages, which are transmitted at a definite regular interval to indicate that a controller is still alive and active. MIDI short messages also include MIDI channel information. Finally we require the Visual Jockey software, a three-dimensional animation package, in which the procedure is ultimately implemented in real time: using it we play the MIDI keyboard notes with the Data Glove, which is quite exciting. We follow a particular protocol to connect the Data Glove, the MIDI software and the MIDI driver in order to generate the anticipated outcome. All the software required for this research can be found on the internet for free.
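The word format described above can be illustrated in Python: each byte on a MIDI cable is framed as a 10-bit serial word, a start bit (0), eight data bits sent least-significant bit first, and a stop bit (1). This is a sketch of the framing only, not a driver implementation:

```python
def frame_midi_byte(byte):
    """Frame one MIDI byte as the 10 bits on the wire: start bit (0),
    8 data bits least-significant-bit first, stop bit (1)."""
    data_bits = [(byte >> i) & 1 for i in range(8)]  # LSB first
    return [0] + data_bits + [1]

def unframe(bits):
    """Recover the byte from a 10-bit serial word."""
    assert bits[0] == 0 and bits[9] == 1, "bad start/stop bit"
    return sum(bit << i for i, bit in enumerate(bits[1:9]))
```

At MIDI's 31,250 baud line rate, each 10-bit word takes 320 microseconds, so a 3-byte short message such as a note-on occupies just under a millisecond on the wire.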
To bring out the expected outcome on time there is a need to follow a particular protocol, that is, a set of rules and regulations, because disruptions in the project can lead to unexpected sequences of events that affect both the timescale and the final result. The methodology used in this project was first to collect all the data required, and then to work through the downloaded software one item at a time. GlovePIE programs are easy to run and implement; we worked on various GlovePIE programs with the Data Glove to see how the behaviour changes as different parameters are varied. A sample GlovePIE program, and the variation in output obtained by changing it, is presented below.
// Grabbing the bow string:
//var.GrabbingBowString = pressed(var.CanGrabBowString and p5.z > -700)
var.TryingToGrabBow = pressed(var.CanGrabBowString)
var.GrabbingBow = False
if ((not var.HoldingBowString) and (var.ValidBowGrip) and var.TryingToGrabBow and (p5.z > -700)) then
    var.GrabbingBow = true
    var.HoldingBowString = true
    Debug = "Grabbing Bow String"
end if

// Pulling back the bow string
var.DrawingBow = var.HoldingBowString and p5.zVelocity < -800
if var.DrawingBow then
    debug = "Drawing Bow"
end if

var.UndrawingBow = var.HoldingBowString and p5.zVelocity > 800
var.BowDrawnBack = var.HoldingBowString and p5.z <= -1100
if var.BowDrawnBack then
    debug = "Bow Drawn"
end if
In the above program, grabbing and pulling back the bow are the Data Glove parameters used to operate the MIDI files via the MIDI driver. We connect the program to the installed MIDI driver by changing the GUI output to the MIDI driver, which is MIDI Yoke, and then execute it. When the program runs, output is generated from the hand movements detected by the Data Glove; this output is passed as input to the MIDI software through MIDI Yoke, where we can play music notes on the MIDI keyboard. This can be tested in the SynthEdit software, which helps to uncover otherwise unnoticed mistakes: as a testing environment, it lets us monitor the outcome of the methodology as we implement it. We change the software's settings by setting the MIDI In port to MIDI Yoke and the MIDI Out port either to the Microsoft Realtek sound card or to MIDI Yoke, forming a loop. We create a project by connecting MIDI In to the MIDI Synth, the MIDI Synth to MIDI Out and the sound card, and finally a MIDI Monitor to MIDI In. When the program executes, all the hand movements of the Data Glove are tracked by the GlovePIE software, and the output generated by GlovePIE is sent as MIDI input to the MIDI In port in SynthEdit by the MIDI Yoke driver. This MIDI input is also connected to the MIDI Monitor, which shows all movements on the screen. The MIDI input is sent to the MIDI Synth, which processes the MIDI files and sends its output either to the sound card or back to MIDI. If the output goes to the sound card (the Microsoft sound card), we can hear it from the sound card integrated in the laptop; if it is sent back to the MIDI Out port, we can view the effect on the MIDI files in the MIDI Monitor. The program we executed is shown below.
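The signal chain just described (glove, GlovePIE, MIDI Yoke, SynthEdit's MIDI In, then the Synth and the Monitor) can be sketched as a chain of Python callbacks. This only simulates the routing idea; the real components are separate Windows programs connected by the MIDI Yoke loopback driver:

```python
class VirtualMidiPort:
    """Stand-in for a MIDI Yoke port: forwards each message to every
    connected receiver, like the loopback driver connecting applications."""

    def __init__(self):
        self.receivers = []

    def connect(self, receiver):
        self.receivers.append(receiver)

    def send(self, message):
        for receiver in self.receivers:
            receiver(message)

# Wire up a chain like the SynthEdit project described in the text:
# glove output -> MIDI In port -> both the MIDI Synth and the MIDI Monitor.
midi_in = VirtualMidiPort()
monitor_log = []  # what the MIDI Monitor would display
synth_log = []    # what the MIDI Synth would play
midi_in.connect(monitor_log.append)
midi_in.connect(lambda m: synth_log.append(("played", m)))

midi_in.send((0x90, 60, 100))  # a note-on triggered by a glove gesture
```

The point of the sketch is that one input port fans out to several consumers, which is exactly why the Monitor can display traffic while the Synth plays it.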
Once we were successful in controlling MIDI files with the Data Glove, we moved to the next phase: executing the same technique with Visual Jockey, which is far more realistic. Here we use P5 MIDI as the glove software and Visual Jockey to perform the experiment, allowing us to play keyboard notes and drum beats with the Data Glove. We changed the software settings by adding the MIDI capture device as MIDI Yoke Note 1, then set up the keyboard events through the "set up events" option: clicking "add events" opens a keyboard waiting for MIDI input, and when movements are sent through the Data Glove we can see and hear the notes played on the keyboard. GlovePIE can also be used as the glove software instead of P5 MIDI. When a MIDI program executes, the hand movements detected by the sensors are received and turned into output by the GlovePIE program; this output is sent as MIDI In to the Visual Jockey software through the MIDI device MIDI Yoke Note 1, and can then control and play the keyboard notes. When selecting the MIDI device, we must make sure the same MIDI driver is selected in both GlovePIE and Visual Jockey. The figure below shows the keyboard being played in Visual Jockey using MIDI Yoke Note 1 as the device.
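When keyboard events are mapped to MIDI input as above, each key corresponds to a MIDI note number. As a small aid for reading those numbers (not part of Visual Jockey), this Python sketch converts a note number to a readable name, using the common convention that middle C, note 60, is C4:

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def note_name(number):
    """Name of a MIDI note number (0-127); middle C (60) is 'C4'
    under the convention assumed here (some tools label it C3)."""
    octave = number // 12 - 1
    return f"{NOTE_NAMES[number % 12]}{octave}"
```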
REASONS FOR CHOSEN TECHNIQUES
In this project MIDI acts as the interface between the computer and the Data Glove. We chose MIDI because of the fast-paced advances in the technology around it, which allow complex hardware designs to be built cost-effectively, and because of its standardised protocol. We incorporated the MIDI standard because any design conforming to the MIDI specification can transmit or respond to control-related data and digital performance data, and any project that conforms to the specification will work without errors. Its increased control throughout an integrated production system, and its potential for future expansion, have given it a strong image in the present industry. Because it is MIDI, we were able to realise a full-scale sound production in the project cost-effectively. Using MIDI in real time allows us to listen to and edit the production at each stage of its development, and MIDI allows performance tasks to be layered, altered, edited, multilayered and improved under fully automated computer control with relative ease. Different types of MIDI-related devices let us choose the production system that best suits our own project style. Through this methodology we could achieve the outcome in very little time with limited resources and technology, saving both time and money.
EXPERT METHOD CHOSEN
The expert m