
Gypsy, The Software, T-24 hours…

While Lil’ Joe sports a fairly simple program, Gypsy ended up as one of the more complex programs I’ve written in NXT-G. The program (still named “Nadar”) really needed to handle several different tasks at once. Primarily a photographic platform, it had to operate the camera in both still & video modes, and control the tilt of the platform to take interesting pictures. As a secondary goal, it needed to datalog the numerous environmental sensors hooked up, as well as “engineering data” (like battery voltage, platform angle, motor noise, etc.). On top of this, it needed to monitor and control the payload temperature, critical to keeping the payload alive. And to complicate all of this, the program needed to be as small as possible, leaving as much memory as possible for datalogging.

In order to handle the photographic challenge, the program uses the same approach as the original Nadar: following a script. It reads a series of timed commands from a text file (think of it as a “photographic program” that Gypsy follows), executes each command (“take picture”, “turn on video”, “tilt platform”, etc.), and then waits a specified time before it reads & executes the next command in the file. This way the programming code for, say, “turn on the camera, take a picture, and turn off the camera” only needs to appear once in the program – it is just invoked every time it’s needed, saving huge amounts of memory over a simple “string of Motor blocks” style program. For comparison, several hundred still photos and five videos, with numerous repositionings of the pitch of the platform, take only about 0.4 kB of NXT memory when stored as such a “script file”.
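The script-following idea can be sketched in ordinary code (Python here, since NXT-G is graphical). The command names and file format below are illustrative guesses, not Gypsy’s actual script syntax:

```python
import time

def run_script(lines, actions, sleep=time.sleep):
    """Execute timed commands of the form 'COMMAND,delay-in-seconds'.

    Each command's code exists exactly once (in `actions`); the script
    just invokes it, the way Gypsy's script file reuses one block of
    NXT-G code per command type.
    """
    for line in lines:
        name, _, delay = line.strip().partition(",")
        actions[name]()               # run the shared routine for this command
        sleep(float(delay or 0))      # wait until the next command is due

# A hypothetical three-line "photographic program":
log = []
actions = {
    "PHOTO":    lambda: log.append("still taken"),
    "VIDEO_ON": lambda: log.append("video started"),
    "TILT":     lambda: log.append("platform tilted"),
}
run_script(["PHOTO,0", "TILT,0", "VIDEO_ON,0"], actions, sleep=lambda s: None)
```

The point of the sketch is the memory math: adding another hundred photos grows only the text file, not the program.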

To handle datalogging at the same time, the program uses a similar system of “timed events”. The same loop that handles waiting for the next timed camera command also checks if, for instance, two seconds have elapsed since the last pressure & temperature logging – if they have, it logs the current pressure/temperature and resets that timer as well. The same process occurs (with varying intervals) for engineering data, heater checks, etc. There’s no reason to do these in multiple parallel sequences… after all, most of the time they would all just be sitting around “waiting” for the right time to elapse. So instead, they all sit in the same loop, checked in series – which means the program executes faster as well. The internal temperatures and the state of the internal heaters are also handled by this “timed event” strategy, so the program is fairly easy to troubleshoot – most of the sections work the same way.
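That single serial loop can be sketched like this (again in Python, with invented intervals and a simulated clock standing in for real time): each task records when it last ran and fires again once its interval has elapsed.

```python
class TimedTask:
    """Run `action` whenever `interval` seconds have elapsed since the last run."""
    def __init__(self, interval, action):
        self.interval, self.action, self.last = interval, action, 0.0

    def poll(self, now):
        if now - self.last >= self.interval:
            self.action()
            self.last = now       # reset this task's timer

events = []
tasks = [
    TimedTask(2, lambda: events.append("pressure/temperature")),  # every 2 s
    TimedTask(5, lambda: events.append("engineering data")),      # every 5 s
]

# One serial loop polls every task in turn -- no parallel sequences needed.
# A simulated clock (one tick per second for 10 s) stands in for real time.
for now in range(11):
    for task in tasks:
        task.poll(now)
```

With a 2-second and a 5-second task over 10 simulated seconds, the first fires five times and the second twice – all from one loop.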

There’s a lot of data to record as well: ambient pressure, three different temperature sensors, external light level, the angle (or pitch) of the platform, NXT battery voltage, when each picture was taken (& whether it was a still or a video), when the platform angle is changed (& how loud the motors sound during this event, as well as the background noise level, and how loud the NXT “Beep” sounds at this point), the two-second average acceleration and sound level, etc. To fit as much as possible into the datalog, all this information is “compressed”, using a very simple data compression scheme coded up in NXT-G that packs information about twice as densely as a text file normally would. The datalogging and data compression are all contained in a self-contained My Block to make things easier – you wire in the information, and it handles writing it to the file, opening a new file if the old one is full or flawed, timestamping it, and compressing everything.
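The post doesn’t describe the compression scheme itself, but the general idea – packing readings as fixed-width binary fields rather than text – can be sketched like this (the field widths and record layout here are my own invention, not Gypsy’s format):

```python
import struct

def pack_record(timestamp_s, pressure, temps, light):
    """Pack one datalog record as fixed-width binary fields.

    Layout (illustrative only): uint16 timestamp, uint16 pressure,
    three int8 temperatures, uint8 light level -- 8 bytes total.
    """
    return struct.pack("<HHbbbB", timestamp_s, pressure, *temps, light)

# The same record written out as a plain text line:
text = "3600,1013,-21,-5,18,142\n"                    # 24 characters
binary = pack_record(3600, 1013, (-21, -5, 18), 142)  # 8 bytes
ratio = len(text) / len(binary)
```

Even this naive packing beats the text form by roughly a factor of three for this record, so a “twice as dense” scheme built from NXT-G’s number and file blocks is entirely plausible.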

On top of all this, the program can switch to a different photoscript just by changing a single variable. Taking advantage of this, when the program detects a free-fall event (like when the balloon bursts), it will switch photographic scripts, as well as change the frequency of datalogging certain variables, so it can adjust its behavior under its own control during the mission (hey, otherwise it’s not a robot, right?).
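A hedged sketch of that mode switch: if the acceleration reading stays near zero-g for a few consecutive samples (free fall after the balloon bursts), swap in a descent script and log faster. The threshold, window size, and script names below are all guesses, not Gypsy’s actual values:

```python
FREEFALL_G = 0.2   # |acceleration| below this (in g) looks like free fall
WINDOW = 3         # consecutive low-g samples required before switching

def choose_script(accel_samples, state):
    """Update and return (script_name, log_interval_s) from recent |accel| readings."""
    recent = accel_samples[-WINDOW:]
    if len(recent) == WINDOW and all(a < FREEFALL_G for a in recent):
        state["script"] = "descent.txt"   # hypothetical second photoscript
        state["log_interval"] = 1         # log certain variables faster on descent
    return state["script"], state["log_interval"]

state = {"script": "ascent.txt", "log_interval": 2}
choose_script([1.0, 1.0, 0.1], state)    # one low-g sample: still ascending
choose_script([0.1, 0.05, 0.08], state)  # sustained free fall: switch scripts
```

Requiring a sustained window rather than a single reading keeps one bumpy sample from triggering the descent script early.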

How big is this complex, multi-tasked program? Remember, I need to save as much space as possible for datalogging all the information from this multi-hour mission, so size is critical. Well, it turns out it occupies only about 20 kB on the NXT – under 20% of the available memory. The “secret” (if that’s the right word) is that I simply reuse code – I do not follow the more common “string of Motor blocks” style that I’ve seen in many NXT-G programs. The result is a program with 25 different My Blocks, interconnected and repeatedly calling each other on multiple levels, instead of 250+ individual Motor (and other) blocks, each of which I would have to set up, and each of which would take up memory. The result may look complex when diagrammed out, but it really isn’t… at least no more so than a "normal" NXT-G program. After all, you don't have to understand what LabVIEW routines are called when you use a certain NXT-G block, or what machine language commands each of those LabVIEW routines is translated to later. In short, I’ve just followed the example set by the folks who designed the software – make it modular, and build on things you already know work. As a result, the program was “written” in about two days, with about a week of testing (because each test run had to last something like 2-4 hours).

Please note this is not because I’m a genius, or because I have deep insight into the minds of programmers (I’ve never had a programming class in my life), or anything like that. I’m just following where a modular language like NXT-G leads me… and curiously, it’s led me to write code that is far more modular and compact than anything I ever ended up writing in text-based languages.

OK, that’s it – tomorrow is the launch, and we’ll see what works (hardware, software, etc.)… and what doesn’t. Cross your fingers… and watch the tracking links on the HALE website to see where these payloads are going "as it happens". I'll also be blogging on this tomorrow during the actual mission.

--
Brian Davis
