Project240.net

Project 240 is expanding to reflect a broader interest in many forms of vehicle dynamics. The study of drifting dynamics is ongoing and will complement research on the sensitivity of unmanned air vehicles to atmospheric turbulence, as well as continuing work on morphing aircraft.

A new section of Project 240 will be developed soon on the dynamics of sub-scale drifting with vehicles such as Kyosho Mini-Zs and 1/10th-scale touring cars.

Other pages on the site will have slowly expanding photo galleries from various projects and trips.


It's been several years since I last competed in autocross, so it was a brilliant experience to return to racing at the Camarillo Airport autocross on September 15, 2013. The race was very well organized, allowing the four run groups, 45 racers in total, to get through 11 practice and timed laps very quickly. After the two timed events, racers were allowed another 10 or so fun laps, nearly to the exhaustion of racer and racecar alike.

Photo gallery of racers in green, red, and blue run groups

Split-screen comparison of 4 runs: first, practice, timed, and fastest

51.094s Miata run - Camera mounted on trunk

52.567s Miata run - Camera mounted on right door

2013 VCG MB2 Kart Race

Our corporate team returned for the 2013 Ventura Corporate Games MB2 race, again with good results. We followed the same process of practice, qualification, team practice, driver selection, and competition as last year. Our teams won first, third, and fourth in Division A. I used only a single GoPro during the race and switched to a transparent data overlay that shows the acceleration and angular rate data synchronized with the video.

Winning lap at 2013 VCG MB2 race with accel/yaw data overlay

Video-Data Fusion of MB2 Kart Race

2012 VCG MB2 Kart Race - Data Analysis

We had been racing the electric karts at MB2 for a few years before our company entered the 2012 Ventura Corporate Games. Our enthusiasm for the kart racing event naturally involved obsessive practicing, multi-angle on-board and off-board filming, inertial data measurement, and coached discussions of handling techniques and racing line strategies. Our three corporate teams won first, second, and fifth place, with the additional honor of having the two fastest male drivers and the fastest female driver at the race.

I filmed the race with two GoPros mounted on the left and right sides of the kart. Each camera provides a view of the front wheel, the steering wheel, the throttle or brake pedal, and the proximity of the kart to the track barriers. The frames are synchronized and combined to give a panoramic view of the race. A data overlay from the inertial sensors shows the acceleration cross plot and yaw rate.
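
As a rough illustration of how this kind of video-data fusion can be scripted, the sketch below places two time-aligned camera frames side by side and draws the matching IMU sample as an acceleration cross plot and a yaw rate readout. It is a minimal example using OpenCV and NumPy; the file names, frame rate, IMU column layout, and overlay geometry are assumptions, not the pipeline actually used for the race videos.

# Sketch of the video-data fusion idea: combine two synchronized GoPro frames
# side by side and overlay the matching IMU sample (acceleration cross plot
# and yaw rate). File names, column layout, and the fixed frame/sample rates
# are assumptions for illustration, not the actual project setup.
import cv2
import numpy as np

FPS = 30          # assumed camera frame rate
IMU_RATE = 100    # assumed IMU sample rate, Hz

# assumed CSV columns: time [s], ax [g], ay [g], yaw_rate [deg/s]
imu = np.loadtxt("kart_imu.csv", delimiter=",", skiprows=1)

left = cv2.VideoCapture("gopro_left.mp4")
right = cv2.VideoCapture("gopro_right.mp4")
writer = None

frame_idx = 0
while True:
    ok_l, frame_l = left.read()
    ok_r, frame_r = right.read()
    if not (ok_l and ok_r):
        break

    # Side-by-side "panoramic" view (simple concatenation, no warping)
    pano = np.hstack([frame_l, frame_r])

    # Pick the IMU sample closest to this frame's timestamp
    t = frame_idx / FPS
    ax, ay, yaw_rate = imu[np.argmin(np.abs(imu[:, 0] - t)), 1:4]

    # Acceleration cross plot in the lower-left corner: 1 g maps to 50 px
    cx, cy, scale = 100, pano.shape[0] - 100, 50
    cv2.circle(pano, (cx, cy), int(1.5 * scale), (255, 255, 255), 1)
    cv2.circle(pano, (int(cx + ay * scale), int(cy - ax * scale)), 6,
               (0, 0, 255), -1)
    cv2.putText(pano, f"yaw rate: {yaw_rate:+6.1f} deg/s", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 255, 255), 2)

    if writer is None:
        h, w = pano.shape[:2]
        writer = cv2.VideoWriter("fusion.mp4",
                                 cv2.VideoWriter_fourcc(*"mp4v"), FPS, (w, h))
    writer.write(pano)
    frame_idx += 1

left.release()
right.release()
if writer is not None:
    writer.release()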


2011 SparkFun Autonomous Vehicle Competition

Promises to test ahead of time were shamelessly broken this year, as I went to the 2011 SparkFun Autonomous Vehicle Competition with an unfinished robot. The vehicle was much improved over last year, with more robust inertial-navigation capability along with laser-rangefinding obstacle detection. Unfortunately, a persistent problem with the serial communication between the image processing computer and the control hardware prevented the system from working.
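
For context on the kind of link involved, the sketch below shows one common way a computer-to-microcontroller serial connection is framed: a start byte, a length byte, a payload, and a checksum so that corrupted packets can be rejected. This is a generic pattern written in Python for illustration; it is not the protocol or code that ran on the robot.

# Illustrative framing for a computer-to-microcontroller serial link:
# start byte, length byte, payload, and a single-byte checksum.
# A generic pattern only, not the robot's actual protocol.
import struct

START = 0xA5

def checksum(data: bytes) -> int:
    return sum(data) & 0xFF

def encode_steer_throttle(steer: float, throttle: float) -> bytes:
    """Pack steering and throttle commands (-1..1) into a framed packet."""
    payload = struct.pack("<hh", int(steer * 1000), int(throttle * 1000))
    return bytes([START, len(payload)]) + payload + bytes([checksum(payload)])

def decode(packet: bytes):
    """Return (steer, throttle) if the frame and checksum are valid, else None."""
    if len(packet) < 4 or packet[0] != START:
        return None
    length = packet[1]
    if len(packet) != 3 + length:
        return None
    payload = packet[2:2 + length]
    if checksum(payload) != packet[2 + length]:
        return None
    steer_raw, throttle_raw = struct.unpack("<hh", payload)
    return steer_raw / 1000.0, throttle_raw / 1000.0

if __name__ == "__main__":
    pkt = encode_steer_throttle(0.25, -0.5)
    print(decode(pkt))   # (0.25, -0.5)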

Although I did not end up running, it's hard to leave an event that featured a recklessly-autonomous 7-foot dinosaur robot without having a great time. The competition this year was fantastic, and much of my report is dedicated to comparing the various approaches taken with the ground vehicle designs. I also include a description of my robot - with particular emphasis on the hardware and control system design.

2010 SparkFun Autonomous Vehicle Competition

AutoDrift debuts at the 2010 SparkFun AVC in Boulder, CO. The competition requires robots to autonomously navigate around the SparkFun building, either by air or on the ground. The competition summary page details my vehicle, control design, and competition experience. Not much in the way of racing success, but the AutoDrift platform is maturing and should be ready for autonomous drift trials in summer 2010.

AutoDrift currently performs cross-track waypoint navigation between up to 10 waypoints and stabilizes the inner-loop dynamics using yaw rate tracking or lateral acceleration tracking. A longitudinal speed controller tracks a desired GPS ground speed, with high-speed segments between waypoints and low-speed segments on approach to waypoints and during large turn commands. A crude obstacle avoidance controller steers away from identified obstacles and reduces speed when obstacles are detected; it is limited to relatively low speeds due to the limited range of the ultrasonic proximity sensors.
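
The sketch below illustrates the guidance logic described above: a cross-track error correction toward the active waypoint leg, a proportional steering command, and a speed schedule that slows on waypoint approach and during large turn commands. The gains, speed limits, and thresholds are placeholder values for illustration, not AutoDrift's actual tuning.

# Sketch of cross-track waypoint guidance with a simple speed schedule.
# Gains, speed limits, and thresholds are illustrative assumptions only.
import math

def cross_track_error(pos, wp_prev, wp_next):
    """Signed perpendicular distance from pos to the wp_prev -> wp_next leg."""
    dx, dy = wp_next[0] - wp_prev[0], wp_next[1] - wp_prev[1]
    px, py = pos[0] - wp_prev[0], pos[1] - wp_prev[1]
    leg_len = math.hypot(dx, dy)
    return (dx * py - dy * px) / leg_len   # left of the leg is positive

def guidance(pos, heading, wp_prev, wp_next,
             k_xtrack=0.05, k_heading=1.0,
             v_cruise=8.0, v_slow=3.0, slow_radius=5.0):
    """Return (steer_cmd, speed_cmd) for one guidance update."""
    # Desired course: leg direction corrected by the cross-track error
    leg_course = math.atan2(wp_next[1] - wp_prev[1], wp_next[0] - wp_prev[0])
    xte = cross_track_error(pos, wp_prev, wp_next)
    desired_course = leg_course - math.atan(k_xtrack * xte)

    # Steering command proportional to heading error, wrapped to [-pi, pi]
    err = math.atan2(math.sin(desired_course - heading),
                     math.cos(desired_course - heading))
    steer_cmd = max(-1.0, min(1.0, k_heading * err))

    # Slow down near the waypoint or during large turn commands
    dist = math.hypot(wp_next[0] - pos[0], wp_next[1] - pos[1])
    speed_cmd = v_slow if (dist < slow_radius or abs(steer_cmd) > 0.5) else v_cruise
    return steer_cmd, speed_cmd

# Example update: robot at (2, 1) m heading east, leg from (0, 0) to (20, 0)
print(guidance((2.0, 1.0), 0.0, (0.0, 0.0), (20.0, 0.0)))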

Office M&M Dispenser Maze



Short video of my M&M dispenser project (full version), which was made to compete with the other candy dispensers around our office (M&M, mint, and jelly bean). My goal was to make a dispenser that is somewhat more entertaining than reaching into a jar but that also reduces the large M&M flux caused by greedy handfuls.
The dispenser maze requires the prospective candy-eater to tilt the maze table and guide the M&M through the sequence to the Win Bucket. There are a number of open wall segments on the maze, beyond which lie the fail ramps that roll the M&M down to the Fail Bucket. While the M&Ms are still accessible in the Fail Bucket, candy-eaters must face public scorn and shunning from colleagues if they eat from there.
Control of the maze is provided by a hand-held controller, which uses 8-bit Nintendo POWER/RESET buttons for M&M dropping and an analog PlayStation joystick for table tilting. An ATtiny861 microcontroller reads the analog position of the joystick and generates PWM commands to drive the pitch and roll tilt servos, which are connected to the table and provide roughly +/- 10 deg of tilt angle.
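
The joystick-to-servo mapping can be illustrated with a short sketch: the 10-bit ADC reading of a joystick axis is offset from center, run through a small deadband, and scaled to a hobby-servo pulse width corresponding to roughly +/- 10 deg of table tilt. The sketch below is written in Python for readability; the center value, deadband, and pulse limits are assumed numbers rather than the firmware's actual constants.

# Illustration of the joystick-to-servo mapping: a 10-bit ADC reading is
# converted to a servo pulse width that tilts the table by roughly +/- 10 deg.
# Center value, deadband, and pulse limits are assumptions for the sketch.

ADC_CENTER = 512        # assumed mid-scale of a 10-bit ADC
ADC_DEADBAND = 10       # counts ignored around center to keep the table still
PULSE_CENTER_US = 1500  # typical hobby-servo neutral pulse
PULSE_RANGE_US = 200    # assumed pulse swing giving about +/- 10 deg of tilt

def adc_to_pulse(adc_value: int) -> int:
    """Map a 0..1023 joystick reading to a servo pulse width in microseconds."""
    offset = adc_value - ADC_CENTER
    if abs(offset) < ADC_DEADBAND:
        offset = 0
    # Scale full ADC deflection (+/- 512 counts) to the full pulse swing
    pulse = PULSE_CENTER_US + offset * PULSE_RANGE_US / 512
    return int(round(pulse))

def pulse_to_tilt_deg(pulse_us: int) -> float:
    """Approximate table tilt for a given pulse, assuming 10 deg at full swing."""
    return (pulse_us - PULSE_CENTER_US) / PULSE_RANGE_US * 10.0

for adc in (0, 256, 512, 768, 1023):
    p = adc_to_pulse(adc)
    print(f"ADC {adc:4d} -> {p} us -> {pulse_to_tilt_deg(p):+.1f} deg")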

Honey Bee Helicopter versus Potato Cannon



Video documents the environmentally friendly disposal of an old fixed-pitch E-Sky Honey Bee R/C helicopter. An organic compound is used to facilitate the mechanical breakdown of the helicopter components and spread them over a wide area of an environmental impact research facility. Clearly, the solution to pollution is dilution.
The authors of the study would like to acknowledge the contributions of Chuck from Von's supermarket, who helped identify the appropriate tubers that have sufficient mass, cohesion, and workability to function as the dispersion catalyst.

Slow Motion Mini-Z Crashes



Overhead view of carnage during our weekly office Mini-Z races. We had an open intersection in the last few turns of our track, with a partial figure-8 right before the finish line. None of the crashes were intentional (right, guys?), but we were rolling footage, just in case.
Filmed using a Photron SA-3 high-speed camera at 1000 frames per second. The video is a cropped version of the full-frame video, which shows more of the moments prior to and following the impacts.

Honey Bee Helicopter Obstacle Course Racing



Maneuver Sequence:
1 - Take-Off from Pad
2 - Blow origami boulders off table
3 - Touch-and-go on three pedestals
4 - Table loops with touch-and-goes
5 - Table leg figure-8s
6 - Balloon popping
7 - Ladder fly-through
8 - Landing on Pad

Aerobatics Flight Data

Aerobatic maneuvers of an instrumented R/C aircraft demonstrate stick inputs and dynamic response during agile flight. The R/C Flash 3D EPP aircraft is equipped with instrumentation to measure accelerations, angular rates, pitot-static pressure, GPS, and servo motion at 100 Hz. The data is recorded to on-board flash memory and is animated and synchronized to the external video in post-processing. DataNinja is used to measure and record the analog signals generated by the IMU, NinjaSense.
Maneuvers demonstrated in the real-time video include inside and outside loops, rolls, point rolls, tail slides, upright spins, inverted spins, tumbles, knife edge, and straight-and-level flight. The slow motion video shows the point rolls, spins, tail slides, and snap rolls at 25% speed.
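
The post-processing synchronization amounts to resampling the 100 Hz flight-data channels onto the video frame times after applying a sync offset identified from a common event in both recordings. The sketch below shows that resampling step with NumPy; the file name, column names, frame rate, and offset value are placeholders rather than the actual project settings.

# Sketch of aligning 100 Hz flight-data channels with external video frames:
# each channel is interpolated onto the video frame times after shifting by a
# manually identified sync offset. File name, column names, frame rate, and
# the offset value are placeholders, not the project's actual settings.
import numpy as np

VIDEO_FPS = 30.0
SYNC_OFFSET_S = 2.35   # assumed: data clock leads the video by this much

# assumed columns: time [s], pitch_rate [deg/s], nz [g], elevator [deg]
data = np.genfromtxt("aerobatics_log.csv", delimiter=",", names=True)
t_data = data["time"] - SYNC_OFFSET_S

n_frames = int((t_data[-1] - t_data[0]) * VIDEO_FPS)
t_frames = t_data[0] + np.arange(n_frames) / VIDEO_FPS

# One interpolated value per video frame for each channel of interest
overlay = {
    name: np.interp(t_frames, t_data, data[name])
    for name in ("pitch_rate", "nz", "elevator")
}

print(f"{n_frames} frames; frame 0 overlay:",
      {k: round(float(v[0]), 2) for k, v in overlay.items()})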

Flight videos and maneuver description at: project240.net/airdata/aerobatics_data.html



Mitigating the Effect of Atmospheric Turbulence: Toward More Useful MAVs

RMIT University, Shifted Dynamics, and Monash University conducted a series of UAV flight tests in a large wind-engineering tunnel. The video presents highlights of the tests, which incorporate parametric variation of the aircraft configuration and geometry to assess suitability for flight in relatively high levels of turbulence. These tests are part of ongoing research into strategies for mitigating the effect of turbulence on the flight of various UAV platforms.

The turbulence intensity and integral length scales in the tunnel can be controlled to some extent by reconfiguring the screens, jet, and collector in the various sections. The turbulence levels are designed to replicate those found over various types of urban terrain.
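
For reference, the turbulence intensity is the ratio of the streamwise velocity standard deviation to the mean speed, and the integral length scale can be estimated from the autocorrelation of the velocity fluctuations together with Taylor's frozen-flow hypothesis. The sketch below computes both from a sampled velocity time series; the sample rate and input file are assumptions, and this is not the facility's actual analysis code.

# Sketch of estimating turbulence intensity and the longitudinal integral
# length scale from a velocity time series. Sample rate and input file are
# assumptions; the integral scale uses Taylor's frozen-flow hypothesis
# (length = mean speed * integral time scale).
import numpy as np

FS = 1000.0                                    # assumed sample rate, Hz
u = np.loadtxt("tunnel_velocity.txt")          # streamwise velocity samples, m/s

u_mean = u.mean()
u_prime = u - u_mean
intensity = u_prime.std() / u_mean             # turbulence intensity Iu

# Autocorrelation of the fluctuations, normalized to 1 at zero lag
acf = np.correlate(u_prime, u_prime, mode="full")[len(u) - 1:]
acf /= acf[0]

# Integrate the autocorrelation up to its first zero crossing -> time scale
zero_crossings = np.where(acf <= 0.0)[0]
cutoff = zero_crossings[0] if zero_crossings.size else len(acf)
T_int = acf[:cutoff].sum() / FS                # integral time scale, s

L_int = u_mean * T_int                         # integral length scale, m
print(f"Iu = {intensity:.3f}, Lu = {L_int:.2f} m")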

Flight test videos and research papers at project240.net/mav_turbulence



Recent developments in drift research were presented at the 2006 SAE World Congress in Detroit, MI, April 3-6, 2006.  The paper summarizes the variations in tire force and moment equilibrium at various drifting angles. 

SAE Paper 2006-01-1019 (click to download PDF manuscript)

Abstract
Driving at large angles of sideslip does not necessarily indicate terminal loss of control; rather, it is the fundamental objective of the sport of drifting. Drift racing challenges drivers to navigate a course in a sustained sideslip by exploiting coupled nonlinearities in the tire force response. The current study explores some of the physical parameters affecting drift motion, both in simulation and experiment. Combined-slip tire models are used to develop nonlinear models of a drifting vehicle in order to illustrate the conditions necessary for stability. Experimental drift testing is conducted to observe the dynamics featured in the track data. An accelerometer array on the test vehicle measures the acceleration vector field in order to estimate the vehicle states throughout the drift testing. Neural networks are used to identify the patterns in the accelerations that correspond to sideslip excursions during drifts. These estimates combined with computations of angular acceleration, yaw rate, and lateral acceleration build a framework for identifying the dynamics in terms of physical parameters and stability and control derivatives. The research developments are intended to support a future study quantifying the effects of vehicle configuration changes on drift capability as related to performance potential and handling qualities.
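
The in-line accelerometer array lends itself to a compact illustration. For planar motion, the lateral acceleration measured at a longitudinal offset x from the center of gravity is a_y(x) = a_y,cg + (dr/dt)*x, so a least-squares line fit across the array at each time step yields the yaw angular acceleration (slope) and the lateral acceleration at the CG (intercept), and integrating the slope gives yaw rate. The sketch below implements that fit; the station positions, sample rate, and file layout are assumed, and the paper's neural-network sideslip estimator is not reproduced here.

# Sketch of recovering lateral acceleration at the CG, yaw angular
# acceleration, and yaw rate from an in-line array of lateral accelerometers.
# For planar motion, a_y measured at longitudinal offset x from the CG is
#   a_y(x) = a_y_cg + rdot * x,
# so a least-squares line fit across the array gives rdot (slope) and a_y_cg
# (intercept); integrating rdot gives yaw rate. Station positions, sample
# rate, and file layout are assumptions for the example.
import numpy as np

FS = 100.0                                   # assumed sample rate, Hz
STATIONS = np.array([-0.8, -0.2, 0.4, 1.0])  # assumed accelerometer x offsets, m

# assumed columns: one lateral-acceleration channel per station, m/s^2
ay = np.loadtxt("drift_accel_array.csv", delimiter=",")   # shape (N, 4)

# Least-squares fit a_y = a_y_cg + rdot * x at every time step
A = np.column_stack([np.ones_like(STATIONS), STATIONS])
coef, *_ = np.linalg.lstsq(A, ay.T, rcond=None)
ay_cg, rdot = coef[0], coef[1]               # m/s^2 and rad/s^2 time histories

# Yaw rate from integrating angular acceleration (drifts without a gyro/GPS fix)
yaw_rate = np.cumsum(rdot) / FS

print(f"peak |a_y| at CG: {np.abs(ay_cg).max():.2f} m/s^2, "
      f"peak yaw rate: {np.degrees(np.abs(yaw_rate).max()):.1f} deg/s")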

 

 

Drift testing with 4 in-line accelerometers, sideslip camera, and steering angle sensor.  March 12, 2006
Photos courtesy of Shadi Krecht.