We're working on an accessible control system for powered wheelchairs.

Our goal is to create a universal attachment for all joystick-powered wheelchairs to accept electromyography signals from the user and move the wheelchair accordingly.

Electromyography (EMG) measures the electrical activity produced by muscles when the motor neurons controlling them fire. These signals can be measured at the skin, giving us a way to distinguish between facial expressions and build a control mapping for the wheelchair.
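As a minimal sketch of what such a control mapping could look like, assuming hypothetical expression labels and command names (these are illustrative, not the labels the headset actually reports):

```python
# Hypothetical sketch: mapping detected facial expressions to wheelchair
# commands. Both the expression labels and the command names are
# placeholder assumptions for illustration.
EXPRESSION_TO_COMMAND = {
    "clench": "forward",
    "raise_brow": "stop",
    "smile_left": "turn_left",
    "smile_right": "turn_right",
    "neutral": "idle",
}

def map_expression(expression: str) -> str:
    """Return the wheelchair command for a detected expression.

    Unrecognized expressions fall back to "idle" so the chair never
    moves on an unexpected input.
    """
    return EXPRESSION_TO_COMMAND.get(expression, "idle")
```

Defaulting unknown inputs to an idle command is a deliberate safety choice: an unmapped or misclassified expression should never produce motion.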

We took on this project to restore a degree of autonomy to people living with quadriplegia and other paralytic conditions.

Blog Updates

We have received the Emotiv Insight EMG headset and have begun to understand how it operates.

The group took turns using the headset and checking its accuracy. We found that accuracy varies with head shape and hair thickness.

While two of the team members (whose heads worked well with the headset) focused on controlling the headset, the other two started on the software implementation. By the end of the meeting, all members were familiar with the headset and its documentation.

We have received most of the electrical components at this point, and this working session focused on getting some of the software working.

The group split the work into two parts: the headset software and the Arduino software. The goals were to read data from the headset and to have the Arduino drive outputs on the board.

By the end of this working session, the group managed to extract raw numerical data from the headset and to turn on LEDs from the Arduino given one of three inputs.
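One way to bridge the two halves is to bucket the raw headset values into the three discrete inputs the Arduino expects. This is a hedged sketch of that step; the threshold values are placeholders, not calibrated figures from the project:

```python
# Illustrative sketch: reducing a raw headset sample to one of three
# discrete inputs (0, 1, 2), each of which the Arduino could map to an
# LED. The thresholds below are assumed values for demonstration only.
LOW_THRESHOLD = 30.0
HIGH_THRESHOLD = 70.0

def classify_sample(raw_value: float) -> int:
    """Bucket a raw signal value into input 0, 1, or 2."""
    if raw_value < LOW_THRESHOLD:
        return 0
    if raw_value < HIGH_THRESHOLD:
        return 1
    return 2
```

In a real pipeline these thresholds would come from per-user calibration rather than fixed constants.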

At this point, we have received the motors. The group spent most of the workshop getting them to respond.

We had trouble getting the motor to respond: the motor controller turned out to be defective, so we took the opportunity to implement the motor controller on-board.

We eventually got the motor working after troubleshooting, swapping out parts, and testing. The process took two days.

This working session focused on integrating the electric motor with the headset. It was a very software-heavy session.

The team worked on extracting the facial muscle movement as a categorical output; the resulting signal was then forwarded as a number to the board.

The meeting ended after we were able to get the motor to respond to the headset.

This working session focused on CADding the mechanical assembly for the system. The group spent most of the time validating measurements against the purchase links and then modeling the frame, carriage, and joystick control stand.

At this point, the team 3D-printed the CAD parts and attempted to assemble the system.

During assembly we found that the tolerances were slightly off, so some of the parts didn't fit. The team sanded the edges of several 3D-printed parts and confirmed that, aside from the tolerance issues, the components fit together.

New parts were printed and the team came together to assemble the system.

After assembly and troubleshooting, the two systems were fully integrated. One edge of the stand that holds the joystick needed modification, so a team member also worked on updating that part.

The new parts were printed and the team was able to assemble the components. This meeting also focused on calibrating both the headset and the motors.
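Since headset accuracy varied from person to person, calibration likely means deriving per-user trigger levels. A minimal sketch of one such routine, assuming a short recording of relaxed-state samples and an arbitrary margin (neither is from the project):

```python
# Illustrative calibration sketch: derive a per-user trigger threshold
# from a short recording of "relaxed" baseline samples. The margin is
# an assumed value, not one measured by the team.
def calibrate_threshold(baseline_samples: list[float], margin: float = 1.5) -> float:
    """Set the trigger threshold a fixed margin above the resting mean."""
    mean = sum(baseline_samples) / len(baseline_samples)
    return mean * margin

def is_triggered(sample: float, threshold: float) -> bool:
    """True when a sample exceeds the calibrated threshold."""
    return sample > threshold
```

A usage example: with a resting baseline averaging 10.0, the threshold becomes 15.0, so only samples above that level count as intentional muscle activity.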

By the end of this meeting, the group had finished integrating all subcomponents and completed calibration.