
Junk Science: Taking My Temperature with the NXT


To see how NXT-2.0 and the new temp sensor performed, I ran the following test. (Click on the graphic for an enlarged view).

1) The bottom bar graph shows the temperature in our living room: 65.4 degrees Fahrenheit, proof of our extremely low natural gas consumption. (This has its downside, as you'll see from the next graph).

2) The middle graph shows the temp when my hand is wrapped around the temp sensor. (I didn't hold it long enough to actually warm it up, but 80 degrees Fahrenheit is a frighteningly low temperature for a body extremity).

3) The top graph shows the temp when the sensor is placed next to my body, underneath my shirt and fleece pullover. (It was only there for about 30 seconds, but an 87 degree body temperature indicates severe hypothermia--I'm heading for the fireplace).

I did establish a Bluetooth connection with the new software and firmware, so the next step is to test remote data logging. I want to test the exportable MyBlocks feature as well. More later.

Update: I thought I'd toss up another graph to show the sensor's time response.
Here I took one 10-minute run, first putting the probe under my arm, then under my tongue (not recommended!), then, after a short cool-off, into a cup of hot tap water, and finally directly into a cup of ice water. I could even read off values, annotate, and rescale, all in the software... and I did all this in about 15 minutes (including the 10 minutes of logging). It really makes it easy. And this shows only a portion of what you can do with this new extension to the environment: hiding data, handling multiple sensors or multiple datasets, zooming in on portions of the graph, and even built-in screenshots and saving data. Very handy. And no, I don't have a fever, I'm just warmer than Rick is ;).
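If you wanted to try something similar outside the NXT data-logging environment, the basic idea is just: poll a temperature reading at a fixed interval, write timestamped samples to a file, and plot or inspect the time response afterward. Here's a minimal sketch of that loop in Python. The read_temperature_f() function is a stand-in I made up for illustration (it does not use the actual NXT sensor API); you'd swap in whatever call your own setup provides.

```python
import csv
import random
import time

# Stand-in for a real sensor read. The actual NXT temperature sensor would be
# queried through the data-logging software or a Bluetooth connection; here we
# just simulate a slowly wandering room-temperature reading in Fahrenheit.
def read_temperature_f():
    return 65.4 + random.uniform(-0.3, 0.3)

def log_temperature(path="temperature_log.csv", duration_s=600, interval_s=5):
    """Poll the sensor every interval_s seconds for duration_s seconds,
    writing (elapsed seconds, degrees F) rows to a CSV file."""
    start = time.time()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "temp_f"])
        while (elapsed := time.time() - start) < duration_s:
            writer.writerow([round(elapsed, 1), round(read_temperature_f(), 2)])
            f.flush()  # keep the file current in case the run is cut short
            time.sleep(interval_s)

if __name__ == "__main__":
    # A short demo run: 30 seconds at a 2-second sample interval.
    log_temperature(duration_s=30, interval_s=2)
```

The resulting CSV opens in any spreadsheet or plotting tool, which gets you a rough equivalent of the time-response graph above, minus the nice annotation and zooming the NXT software gives you for free.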

--
Brian Davis
