Data Collection & Analysis

Setup

We used two phones to collect accelerometer and orientation data, with one attached to the seat and one to the handle of the erg. To process this data, we first loaded the two datasets into MATLAB.
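
For reference, here is a minimal sketch of that loading step in MATLAB. The file names and column names are placeholders for whatever the phone logging app actually exported, not the real format of our logs.

    % Load each phone's log; readtable handles CSV exports with headers.
    handleRaw = readtable('handle_log.csv');   % phone attached to the handle
    seatRaw   = readtable('seat_log.csv');     % phone attached to the seat

    % Pull out time (s) and the three acceleration channels (m/s^2).
    tHandle = handleRaw.time;
    aHandle = [handleRaw.ax, handleRaw.ay, handleRaw.az];
    tSeat   = seatRaw.time;
    aSeat   = [seatRaw.ax, seatRaw.ay, seatRaw.az];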

Processing

Based on the orientation of the phones, the principal axis for the handle data was the x-axis; for the seat, it was the y-axis. As you can see from the collected data, there were small vibrations along each of the other axes, but we did not use these in our analysis.

During the test, the subject took several well-executed strokes and several poorly timed strokes so that the two could be compared. We therefore trimmed the data into two distinct segments, allowing us to compare the plot shapes for good and bad strokes. We also synchronized the handle and seat datasets and interpolated them onto a common time base to ensure that the data from the two sensors would align.
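
A rough sketch of the trim/sync/interpolate step, reusing the variables from the loading sketch above. The column choices follow the principal axes noted earlier (handle: x, seat: y); the segment window and sample rate are assumed values, not the ones from our test.

    % Keep only the samples inside one trimmed segment.
    t0 = 10;  t1 = 60;                          % assumed segment start/end (s)
    keepH = tHandle >= t0 & tHandle <= t1;
    keepS = tSeat   >= t0 & tSeat   <= t1;

    % Resample both sensors onto a shared, uniformly spaced time base.
    fs = 100;                                   % assumed common sample rate (Hz)
    t  = t0 : 1/fs : t1;
    aHandleX = interp1(tHandle(keepH), aHandle(keepH, 1), t, 'linear');
    aSeatY   = interp1(tSeat(keepS),   aSeat(keepS, 2),   t, 'linear');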

Velocity

To obtain velocity from the acceleration data, we integrated each dataset. Since calculating velocity via integration introduced significant creep into our datasets, we manually determined linear functions that could be subtracted from each dataset so that the plots would be more or less flat, leaving just the oscillation to be analyzed.
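
A sketch of this step under the same assumptions as above; the slopes and offsets are placeholders standing in for the values we chose by hand.

    % Numerically integrate acceleration to get velocity.
    vHandle = cumtrapz(t, aHandleX);
    vSeat   = cumtrapz(t, aSeatY);

    % Subtract a manually chosen linear trend so the traces sit roughly flat.
    slopeH = 0.02;   offsetH = 0;               % placeholder manual fit values
    slopeS = 0.015;  offsetS = 0;
    vHandle = vHandle - (slopeH * (t - t(1)) + offsetH);
    vSeat   = vSeat   - (slopeS * (t - t(1)) + offsetS);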

Good Strokes

The plots of the "good" strokes look very similar to our motion model. The seat velocity is positive during the leg drive and near zero into the finish (while the arms and body are still moving, as evidenced by the handle plot). Then, both plots become negative on the recovery as the handle and seat accelerate back toward the catch.

Bad Strokes

On our "bad" stroke plots, the same general shape is observed, but we can distinctly make out an extra impulse as the subject "rushes the slide" on the recovery, breaking the legs early and the arms late. In addition, the smooth handle velocity curve of the good strokes becomes much more jagged, indicating an inconsistent application of power and an inefficient stroke.

Frequency Analysis

We wanted to see whether the changes in the velocity plots between the good strokes and bad strokes could also be seen in the DFT. Below are DFT plots of the normalized velocity for both the seat and handle data, comparing the frequencies found in the good and bad sections. Initially, we tried to plot the raw (non-normalized) velocity data, but it produced DFTs that were much less clear.
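
A sketch of how plots like these can be produced, assuming vSeatGood and vSeatBad hold the drift-corrected seat velocities for the two trimmed segments (the handle data is treated identically) and fs is the sample rate from the earlier sketch. The normalization shown (zero mean, unit peak) is one reasonable choice rather than necessarily the exact one we used.

    % Normalize each segment before taking the DFT.
    vGood = vSeatGood - mean(vSeatGood);   vGood = vGood / max(abs(vGood));
    vBad  = vSeatBad  - mean(vSeatBad);    vBad  = vBad  / max(abs(vBad));

    % Magnitude spectra and their frequency axes.
    NG = numel(vGood);   fG = (0:NG-1) * fs / NG;   magGood = abs(fft(vGood)) / NG;
    NB = numel(vBad);    fB = (0:NB-1) * fs / NB;   magBad  = abs(fft(vBad))  / NB;

    plot(fG, magGood, fB, magBad);
    xlim([0, 2]);                               % stroke content sits well below 2 Hz
    xlabel('Frequency (Hz)');  ylabel('|V(f)|');
    legend('good strokes', 'bad strokes');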

The changes in the Fourier transform between the good strokes and the bad strokes were not very clear, especially in the handle data. This makes sense: the changes that appeared in the velocity plots occurred at the same frequency as the overall stroke cycle, so they show up as phase shifts and amplitude changes rather than as new frequencies.

Looking at the seat data, one of the main frequencies we found was 0.29 Hz, which is very close to our prediction of 0.28 Hz and represents the oscillation of going through one full stroke. We also found a component at 0.45 Hz, which is close to double 0.28 Hz. It makes sense that the magnitude of the 0.29 Hz component decreased for the bad-stroke data: because of the dip at the bottom of the graph during the recovery, the 0.29 Hz component fits the curve slightly less well. It also makes sense that the 0.45 Hz component is larger in the bad data: in addition to the full stroke cycle, there is an additional oscillation in the middle of the recovery, so a component at twice the stroke frequency fits the bad-stroke data significantly better than the good-stroke data.
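
The dominant peaks discussed above can also be pulled out programmatically, for example with findpeaks from the Signal Processing Toolbox; the prominence threshold here is an assumption chosen to ignore small ripples.

    half = 1:floor(NG/2);                       % one-sided spectrum
    [pks, locs] = findpeaks(magGood(half), 'MinPeakProminence', 0.05);
    peakFreqs = fG(locs);                       % e.g. components near 0.29 Hz and 0.45 Hz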

To glean from the DFT whether the user is rowing correctly: a large magnitude at a frequency around double the stroke frequency suggests that there is some extra, unnecessary motion during the stroke. However, the DFT cannot tell us specifically where in the stroke the non-smooth motion occurs, since an extra movement during the drive would cause the same behavior. It would be more informative to look at the velocity plots to determine where in the stroke the user is straying from the "good stroke" curves and to make suggestions accordingly.
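
As a sketch of that rule of thumb, one could flag a segment when the spectral magnitude near twice the stroke frequency is large relative to the magnitude at the stroke frequency itself; the 0.05 Hz search window and the 0.6 ratio threshold below are assumptions, not measured values.

    strokeF = 0.29;                             % Hz, from the seat DFT
    magNear = @(mag, f, target) max(mag(abs(f - target) < 0.05));
    ratio   = magNear(magBad, fB, 2 * strokeF) / magNear(magBad, fB, strokeF);
    if ratio > 0.6
        disp('Possible extra motion during the stroke (e.g. rushing the slide).');
    end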

Next Steps

To further develop this project, there are a few things we think would be good next steps:

  • Implement a curve fit to normalize the velocity data automatically instead of relying on manual linear function fitting (see the sketch after this list)

    • Alternatively, find another way to minimize creep in velocity data

  • Analyze a larger quantity of data to create a more robust model of an ideal stroke, such that deviation from this model can be linked to specific aspects of the stroke that can be improved
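
For the first item, one possible starting point is a least-squares linear fit (or MATLAB's built-in detrend) in place of the hand-picked slope and offset; a rough sketch, reusing the variables from the velocity section:

    % Fit the linear drift automatically and subtract it.
    vSeatRaw  = cumtrapz(t, aSeatY);
    p         = polyfit(t, vSeatRaw, 1);        % least-squares linear fit
    vSeatAuto = vSeatRaw - polyval(p, t);

    % Equivalent built-in: vSeatAuto = detrend(vSeatRaw, 'linear');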