Netscapes: Building

Today we focused on finishing the physical build of our project.

ledhead

The LED-ring-powered glass head. Colours are chosen via a mobile phone (off screen).

The second Neopixel ring arrived, so we soldered in some headers and built a circuit board for it. We installed the necessary drivers into the Arduino IDE and programmed it to output different colours.

We then merged this code with the Bluetooth code Chris made earlier.
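The merged sketch boils down to a lookup from the received Bluetooth character to an RGB value. A minimal Python sketch of that logic (the character-to-colour table here is illustrative, not our exact mapping; the real code runs on the Arduino):

```python
# Hypothetical mapping from a received Bluetooth character to an RGB colour.
COLOUR_MAP = {
    "r": (255, 0, 0),     # red
    "g": (0, 255, 0),     # green
    "b": (0, 0, 255),     # blue
    "w": (255, 255, 255)  # white
}

def colour_for_char(c, default=(0, 0, 0)):
    """Return the RGB triple for a single received character."""
    return COLOUR_MAP.get(c.lower(), default)
```

Unrecognised characters fall back to "off", so stray bytes over the serial link don't leave the head in an odd colour.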

We mounted the Arduino, breadboard and Bluetooth module to the interior of the plinth, and drilled holes into the head base to accommodate the wiring for the RGB LED ring.

IMG_20180113_161942

The Arduino & Circuitry mounted inside the plinth.

This LED ring is brighter than the first, even inside the diffuser, making it even better suited to our finalised build!

IMG_20180113_151014_1.jpg

The LED ring mounted in the plinth.

Our next steps:

  • I will develop an app that sends characters over a Bluetooth connection, letting us change the colour of the head remotely without needing a Bluetooth terminal app.
  • I need to build the server connections for our visualization, input sliders and MySQL database.
  • Gintaré and Chris need to complete the visualization for the IVT theatre.

 


Everyware: Icebreaker development – LEDs & MQTT

In this post I will briefly outline my part in the creation of our wearable technology: The Icebreaker T-Shirt.

Prototyping: Single RGB LED

I started off by prototyping MQTT connections with a single RGB LED. I built a basic circuit and programmed it so that it would connect to the MQTT broker, and light up when data is received.
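The receive side can be sketched as a small payload parser; the "r,g,b" payload format here is an assumption for illustration (the real sketch ran on the microcontroller):

```python
def parse_rgb_payload(payload):
    """Parse an MQTT payload like '255,128,0' into an (r, g, b) tuple.
    Returns None for malformed payloads so the LED state is left unchanged."""
    parts = payload.split(",")
    if len(parts) != 3:
        return None
    try:
        r, g, b = (int(p) for p in parts)
    except ValueError:
        return None
    # Reject out-of-range channel values rather than clamping silently.
    if all(0 <= v <= 255 for v in (r, g, b)):
        return (r, g, b)
    return None
```

Validating before writing to the LED means a garbled message from the broker can't crash or glitch the light.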

IMG_20171125_192722

NodeMCU & RGB LED

RGBW LED Strips

After this, we experimented with using RGBW LED strips. I wired up a circuit (using transistors to prevent burning out my controller board) and experimented with some code to create patterns.

After experimentation, it became apparent that these strips were not individually addressable, so they were not appropriate for our use.

img_20171121_122214.jpg

Wiring up the RGBW LED strip.

24 RGB LED Ring

Next we moved on to a 24-LED ring. First, I soldered some pins onto the ring for easy wiring. Later these could be removed and replaced with direct solder joints, or potentially with connections made using conductive thread.

IMG_20171201_112748

24 LED ring with sample code.

I worked out how to individually address each LED and then used the code from the first single RGB LED to connect it to the MQTT broker.
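Conceptually, individually addressing the ring is just writing one colour per index and then pushing the whole frame out, in the style of NeoPixel's setPixelColor()/show(). A Python sketch of that frame model:

```python
NUM_LEDS = 24  # size of the ring

def make_frame(colour=(0, 0, 0)):
    """One entry per LED on the ring; all LEDs start at the same colour."""
    return [colour] * NUM_LEDS

def set_pixel(frame, index, colour):
    """Address a single LED by its position on the ring (0-23).
    The modulo lets callers wrap around the ring safely."""
    frame[index % NUM_LEDS] = colour
    return frame
```

On the real hardware the frame buffer lives in the NeoPixel library and `show()` streams it out to the ring in one go.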

After this, I set up basic reactions to each input to simulate what will happen once the database is properly connected; for example, the left side lights up when a match is detected to the left.

Basic lighting effects/reactions include:

  • Ambient (No match/resting)
  • Match (Different colours for different matches) – Currently 3 max
  • Left – Left side lights up
  • Right – Right side lights up
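The reactions above map neatly to sets of LED indices. A Python sketch, assuming LEDs 0-11 form the left half of the ring and 12-23 the right (the real split depends on how the ring is mounted):

```python
NUM_LEDS = 24
MATCH_COLOURS = [(255, 0, 255), (0, 255, 255), (255, 255, 0)]  # 3 match colours max

def reaction_pixels(reaction, match_id=0):
    """Return {led_index: colour} for each basic reaction."""
    if reaction == "ambient":                      # no match / resting: dim glow
        return {i: (10, 10, 10) for i in range(NUM_LEDS)}
    if reaction == "match":                        # whole ring in the match colour
        colour = MATCH_COLOURS[match_id % len(MATCH_COLOURS)]
        return {i: colour for i in range(NUM_LEDS)}
    if reaction == "left":                         # left half only
        return {i: (255, 255, 255) for i in range(NUM_LEDS // 2)}
    if reaction == "right":                        # right half only
        return {i: (255, 255, 255) for i in range(NUM_LEDS // 2, NUM_LEDS)}
    return {}                                      # unknown reaction: no change
```

Keeping the reactions as data like this makes it easy to swap in prettier effects later without touching the MQTT plumbing.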
img_20171210_210009.jpg

Wiring up my NodeMCU board to the LED ring. The circuit requires capacitors to prevent damage to the LED ring. (The tape over the LEDs is for diffusion.)

The next steps are to create more visually appealing lighting effects & reactions (which I will work on over the next week) and to hook it up to the database via MQTT, which is being handled by my other team members.


 

Links

Adafruit Neopixel/Library: https://learn.adafruit.com/adafruit-neopixel-uberguide?view=all

 

Everyware: Developing wearable technologies

Since I am responsible for designing and building many of the back-end technologies, I looked into how we can not only make the shirt a reality, but also maximize its re-usability.

We considered using technologies such as:

  • LilyPad Arduino – a dedicated board for wearable technologies; the LilyPad will be used to power and control the T-shirt's functions.
  • Conductive thread – for connections between the LilyPad, LEDs and sensors; more robust and better suited to the task than soldered wires, which may snap under stress!
  • RGB LEDs and LED strips – for creating lighting and light patterns within the T-shirt. If strips are used, the LEDs within them need to be individually addressable.
  • Bluetooth or GPS – for tracking proximity to other T-shirts in the area and transmitting data between them.

If we decide to use the Lilypad Arduino over the NodeMCU board, we will have to consider other problems, such as internet connectivity.


Using conductive thread

The best conductive thread for machine or hand sewing is silver-plated fiber. It has good sewability and a clean finish, and is less likely to get stuck in the thread take-up of a sewing machine, unlike stainless steel fibers. For machine sewing, a “z-twist” direction should be used; for hand stitching, either type can be used. (Instructables.com, 2017)

To avoid shorts in conductive thread, power and ground lines should be kept a good distance apart. During stitching, the fabric should be kept taut and flat. All thread should be tested with a multimeter prior to use. (Stern, 2017)


Waterproofing and wash-ability

Since we are making wearable technologies, we have extra points to consider, such as durability & the ability to wash it.

Washing LED T-shirts
Shirts that have an LED panel should be hand-washed only, to prevent cracking the panel. Battery packs and other water-sensitive parts should be removed prior to washing. Many of these shirts have an interior pocket that allows the wearer to remove the battery pack. (Flashion Statement, 2017)

Shirts that have non-removable electronics are generally dry clean only. (Christmasjumpercompany.co.uk, 2017)

Conductive Thread & Washing

Silver-plated fibers are less suitable for washing, as oxidation can occur; stainless steel fibers, however, can be washed without risk. (Instructables.com, 2017)

The thread must be dried thoroughly to reduce the risk of shorting (particularly plated-fiber threads, which may stay damp inside). (Stern, 2017)


Sources:

Flashion Statement. (2017). Washing Instructions – Flashion Statement. [online] Available at: https://www.flashionstatement.com/light-up-t-shirts/washing-instructions/ [Accessed 19 Nov. 2017].

Christmasjumpercompany.co.uk. (2017). Product Care Instructions for your purchase – Christmas Jumper Company 2017. [online] Available at: http://www.christmasjumpercompany.co.uk/care-instructions/4586923675 [Accessed 19 Nov. 2017].

Instructables.com. (2017). Selection Guide of Conductive Thread for Machine Sewing. [online] Available at: http://www.instructables.com/id/Selection-Guide-of-Conductive-Thread-for-Machine-S/ [Accessed 19 Nov. 2017].

Stern, B. (2017). Overview | Conductive Thread | Adafruit Learning System. [online] Learn.adafruit.com. Available at: https://learn.adafruit.com/conductive-thread/overview [Accessed 19 Nov. 2017].

Stern, B. (2017). Preventing short circuits | Conductive Thread | Adafruit Learning System. [online] Learn.adafruit.com. Available at: https://learn.adafruit.com/conductive-thread/preventing-short-circuits [Accessed 19 Nov. 2017].

 

 

Everyware: AI & Emotional Scoring

In order to test the technologies we plan on using for this project, I built a small prototype. This prototype is a reflection of what we plan to make for our final outcome, but on a smaller scale.

The technologies I used were: Amica NodeMCU board, RGB LED, MQTT & IBM Watson services (speech to text & Tone analysis).

How it works

Everyware prototype (1).png

A voice input is taken via an app. The speech is converted to text using IBM Watson’s Speech to Text service. This text is then fed into IBM Watson’s Tone Analyzer, which returns an emotional ID (such as happy, sad or angry) and a percentage score.

This emotional ID/score data is then processed in JavaScript/Node-RED, and published to the MQTT broker on a specific channel.
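The processing step essentially packs the tone ID and score into one integer before publishing. A Python sketch of one possible packing scheme (the tone table and the hundreds-digit encoding are assumptions for illustration, not the exact production code):

```python
TONE_IDS = {"joy": 1, "sadness": 2, "anger": 3}  # assumed mapping

def encode_tone(tone_name, score):
    """Pack a tone ID and a 0.0-1.0 score into a single integer:
    hundreds digit = tone ID, remainder = percentage.
    e.g. joy at 0.87 -> 187."""
    tone_id = TONE_IDS.get(tone_name.lower(), 0)  # 0 = unrecognised tone
    percent = int(round(score * 100))
    return tone_id * 100 + percent
```

Sending a single small integer keeps the MQTT payload trivial for the microcontroller to parse.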

The NodeMCU board is subscribed to the same channel and receives the processed data, which is then used to determine what colour to make the RGB LED it is connected to.

Physical Prototype

I built the basic speech-to-text app using PhoneGap, as it is an ideal solution for rapidly prototyping apps that work on a wide range of mobile devices. It also has dedicated libraries for MQTT connections.

I programmed the NodeMCU board to receive the data from the MQTT and use that to determine what colour to make the RGB LED. Since the tone ID & score were simplified into integers earlier, all it has to do is take the returned number and use it to control the colour and brightness, such as turning it blue for sadness, and making it brighter for a high percentage.
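On the board side the decode is the reverse: split the packed integer into tone and score, then scale the tone's base colour by the score. Sketched in Python, assuming a hundreds-digit tone-ID encoding and an illustrative colour table (the real firmware runs on the NodeMCU):

```python
# Assumed base colours: 1 = joy (yellow), 2 = sadness (blue), 3 = anger (red)
TONE_COLOURS = {1: (255, 255, 0), 2: (0, 0, 255), 3: (255, 0, 0)}

def decode_tone(value):
    """Split the packed integer back into (tone_id, percent)."""
    return value // 100, value % 100

def led_colour(value):
    """Scale the tone's base colour by the percentage score, so a
    stronger emotion produces a brighter LED."""
    tone_id, percent = decode_tone(value)
    base = TONE_COLOURS.get(tone_id, (0, 0, 0))
    return tuple(c * percent // 100 for c in base)
```

So a "sadness at 50%" message turns the LED a half-brightness blue, while "anger at 99%" gives a near-full red.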

IMG_20171125_192722

NodeMCU with RGB LED

Screenshot_2017-11-25-19-10-44

RGB LED turns red when Watson’s Tone analyser detects Anger

Art & the Internet of Things

Immaterials
By Timo Arnall, Einar Sneve Martinussen & Jack Schulze

Immaterials is a collection of pieces centered around the increasing use of ‘invisible interfaces’ such as WiFi and mobile networks, and the impact they have on us. (Arnall, 2013)

Immaterials: Light Painting & WiFi explores the scale of WiFi networks in urban spaces, and translates signal strength into unique light paintings.

Immaterials: Light painting WiFi  (Arnall, 2011)

Immaterials also utilises a series of satellite-sensitive lamps that change light intensity according to the strength of the GPS signals received. (Arnall, 2013)


The Nemesis Machine
By Stanza

a_stanza_3987

The Nemesis Machine in exhibition (Stanza, n.d.)

The Nemesis Machine is a travelling installation artwork. It uses a combination of digital cities and IoT technology. It visualises life in the city based on real-time data from wireless sensors, representing the complexities of cities and city life. (Stanza.co.uk, n.d.)


ExTouch
By Shunichi Kasahara, Ryuma Niiyama, Valentin Heun & Hiroshi Ishii

exTouch incorporates touchscreen interactions into the real world. Users can touch objects shown in live video, dragging them across the screen and through physical space. (Kasahara et al., 2012)


exTouch in action (exTouch, 2013)


T(ether)
By Dávid Lakatos, Matthew Blackshaw, Alex Olwal, Zachary Barryte, Ken Perlin & Hiroshi Ishii

T(ether) is a platform for gestural interaction with objects in digital 3D space, with a handheld device acting as a window into virtual space. T(ether) has potential as a platform for 3D modelling and animation. (Lakatos et al., 2012)


Sources

IMMATERIALS

Arnall, T. (2013). The Immaterials Project. [online] Elastic Space. Available at: http://www.elasticspace.com/2013/09/the-immaterials-project [Accessed 1 Nov. 2017].

Arnall, T. (2011). Immaterials: Light Painting WiFi. [Video] Available at: https://vimeo.com/20412632 [Accessed 1 Nov. 2017].

NEMESIS MACHINE

Stanza (n.d.). The Nemesis Machine Installation. [image] Available at: http://www.stanza.co.uk/nemesis-machineweb/index.html [Accessed 1 Nov. 2017].

Stanza.co.uk. (n.d.). The Nemesis Machine – From Metropolis to Megalopolis to Ecumenopolis. A real time interpretation of the data of the environment using sensors.. [online] Available at: http://www.stanza.co.uk/nemesis-machineweb/index.html [Accessed 1 Nov. 2017].

EXTOUCH

Kasahara, S., Niiyama, R., Heun, V. and Ishii, H. (2012). exTouch. [online] Tangible.media.mit.edu. Available at: http://tangible.media.mit.edu/project/extouch/ [Accessed 1 Nov. 2017].

exTouch. (2013). [Video] MIT Media Lab: MIT Media Lab. Available at: https://vimeo.com/57514726 [Accessed 1 Nov. 2017].

T(ETHER)

Lakatos, D., Blackshaw, M., Olwal, A., Barryte, Z., Perlin, K. and Ishii, H. (2012). T(ether). [online] Tangible.media.mit.edu. Available at: http://tangible.media.mit.edu/project/tether/ [Accessed 1 Nov. 2017].

Netscapes: Internet Art

eCLOUD
By Aaron Koblin, Nik Hafermaas & Dan Goods

ecloud-02

eCLOUD installation & display (Koblin, n.d.)

eCLOUD is an installation piece consisting of polycarbonate tiles that fade between being opaque and transparent based on real-time weather data. It has an accompanying display placed at eye level. The piece is permanently housed at San Jose International Airport. (Postscapes.com, n.d.)


Thaw
By Sang-won Leigh, Philipp Schoessler, Felix Heibeck, Pattie Maes & Hiroshi Ishii

THAW: Hybrid Interactions with Phones on Computer Screens from Tangible Media Group (MIT Tangible Media Group, 2014)

 

Thaw is an interaction system that bridges the gap between a handheld device and a large display. The handheld device is used to manipulate objects on the display; position tracking is achieved using the smartphone’s back-facing camera. (Leigh et al., 2012)


Sources:

ECLOUD

Koblin, A. (n.d.). eCLOUD installation & display. [image] Available at: http://www.aaronkoblin.com/project/ecloud/ [Accessed 1 Nov. 2017].

Postscapes.com. (n.d.). IoT Art – Real Time Networked Art Installations. [online] Available at: https://www.postscapes.com/networked-art-10-projects-using-real-time-data/ [Accessed 1 Nov. 2017].

THAW

MIT Tangible Media Group (2014). THAW: Hybrid Interactions with Phones and Computer Screens. Available at: https://vimeo.com/105950126 [Accessed 1 Nov. 2017].

Leigh, S., Schoessler, P., Heibeck, F., Maes, P. and Ishii, H. (2012). THAW. [online] Tangible.media.mit.edu. Available at: http://tangible.media.mit.edu/project/thaw/ [Accessed 1 Nov. 2017].

Everyware: The Matter of the Immaterial

The brief for “Everyware” is entitled “The Matter of the Immaterial”, and is focused around ubiquitous computing and making the intangible tangible. I took this idea and used it as a starting point for some research into what is already available.


Inspirations:

Ultrahaptics

virtual-reality-ultrahaptics-vr

Ultrahaptics development kit (Ultrahaptics, 2015)

Ultrahaptics is a startup company focused on making the virtual world physical. Using an array of ultrasonic projectors and hand tracking, users can feel and interact with virtual environments, receiving real tactile feedback without wearing or holding special equipment. (Ultrahaptics, 2017) Read more on my other blog post.

rsz_142

Ultrahaptics Diagram  (Ultrahaptics, 2015)

Ultrahaptics follows a similar concept to the Geomagic Touch X 3D pen (Previously known as Sensable Phantom Desktop), which I have used!


DaisyPi

DaisyPi

DaisyPi system (DaisyPi, 2017)

The Daisy Pi is a Raspberry Pi powered home monitoring system. It is fitted with multiple sensors including temperature, light intensity and humidity. It is also capable of capturing audio and video feeds, which can be accessed remotely by devices such as mobile phones or tablets. (Lopez, 2017)


Moon 

ai-weiwei-olafur-eliasson-moon-designboom-03

Moon up close (Designboom, 2014)

Moon is an interactive installation piece created by Olafur Eliasson and Ai Weiwei. It invites viewers from around the globe to draw and explore a digital “Moonscape”. (Feinstein, 2014)

Eliasson and Weiwei’s work is focused around community and the link between the online and offline world. (Austen, 2013)

Over the course of its four years of existence, Moon grew from simple doodles and drawings to collaborations and clusters of work, such as the “Moon Elisa”, where multiple users came together to recreate the classic Mona Lisa painting. (Cembalest, 2013)

“The moon is interesting because it’s a not yet habitable space so it’s a fantastic place to put your dreams.” – Olafur Eliasson, on Moon (Feinstein, 2014)


Illuminating Clay

Illuminating Clay is a platform for exploring 3D spatial models. Users can manipulate the clay into different shapes (even adding other objects); a laser scanner captures the surface and a projector casts a height map back onto it. It can also be used to derive data such as travel times and land erosion. (Piper et al., 2002)


Physical Telepresence

1444925735431983

Interaction through Physical Telepresence (Vice, 2015)

Physical Telepresence is a work created by students at MIT, based around shared workspaces and remote manipulation of physical objects. (Leithinger et al., 2014) The work consists of a pin-based surface that can be used to interact with physical objects. (Pick, 2015)


Near Field Creatures

Near Field Creatures is a game made by students as part of the annual Mubaloo appathon at Bristol University. Users scan NFC tags (such as those in certain student cards) and collect different animals of differing values. The collected animals can then be used to compete with other users. (Mubaloo, 2015)

Pico
Pico is an interactive work that explores human-computer interaction, allowing people and computers to collaborate in physical space. It is operated using pucks, which can be moved by both the computer and the user. (Patten, Alonso and Ishii, 2005)

PICO 2006 from Tangible Media Group on Vimeo. (Pico 2006, 2012)

 


Sources:

ULTRAHAPTICS

Ultrahaptics (2015). Ultrahaptics Development Kit. [image] Available at: http://www.ibtimes.co.uk/ultrahaptics-bringing-sensation-touch-virtual-reality-1489289 [Accessed 28 Oct. 2017].

Ultrahaptics. (2017). Ultrahaptics – A remarkable connection with technology. [online] Available at: https://www.ultrahaptics.com/ [Accessed 28 Oct. 2017].

Ultrahaptics (2015). Ultrahaptics diagram. [image] Available at: http://electronics360.globalspec.com/article/5907/touch-control-with-feeling [Accessed 28 Oct. 2017].

DAISYPI

DaisyPi (2017). Daisy Pi Unit. [image] Available at: https://www.slideshare.net/howtoweb/valerian-banu [Accessed 28 Oct. 2017].

Lopez, A. (2017). Daisy Pi | The home monitoring e-flower. [online] Daisypi.ro. Available at: http://daisypi.ro/ [Accessed 28 Oct. 2017].

MOON

Designboom (2014). Moon close up. [image] Available at: https://www.designboom.com/art/ai-weiwei-olafur-eliasson-give-rise-to-moon-interactive-artwork-11-26-2013/ [Accessed 30 Oct. 2017].

Feinstein, L. (2014). Make Your Mark On The Moon With Olafur Eliasson and Ai Weiwei. [online] Creators. Available at: https://creators.vice.com/en_uk/article/yp5zkj/make-your-mark-on-the-moon-with-olafur-eliasson-and-ai-weiwei [Accessed 30 Oct. 2017].

Cembalest, R. (2013). How Ai Weiwei and Olafur Eliasson Got 35,000 People to Draw on the Moon | ARTnews. [online] ARTnews. Available at: http://www.artnews.com/2013/12/19/how-ai-weiwei-and-olafur-eliasson-got-35000-people-to-draw-on-the-moon/ [Accessed 30 Oct. 2017].

Austen, K. (2013). Drawing on a moon brings out people’s best and worst. [online] New Scientist. Available at: https://www.newscientist.com/article/dn24702-drawing-on-a-moon-brings-out-peoples-best-and-worst/ [Accessed 30 Oct. 2017].

ILLUMINATING CLAY

Piper, B., Ratti, C., Wang, Y., Zhu, B., Getzoyan, S. and Ishii, H. (2002). Illuminating Clay. [online] Tangible.media.mit.edu. Available at: http://tangible.media.mit.edu/project/illuminating-clay/ [Accessed 30 Oct. 2017].

PHYSICAL TELEPRESENCE

Vice (2015). Interaction with Physical Telepresence. [image] Available at: https://motherboard.vice.com/en_us/article/ae3598/watch-a-robotic-floor-play-with-blocks [Accessed 30 Oct. 2017].

Leithinger, D., Follmer, S., Olwal, A. and Ishii, H. (2014). Physical Telepresence. [online] Tangible.media.mit.edu. Available at: http://tangible.media.mit.edu/project/physical-telepresence/ [Accessed 30 Oct. 2017].

Pick, R. (2015). Watch a Robotic Floor Play with Blocks. [online] Motherboard. Available at: https://motherboard.vice.com/en_us/article/ae3598/watch-a-robotic-floor-play-with-blocks [Accessed 30 Oct. 2017].

NFC CREATURES

Mubaloo. (2015). Mubaloo and Bristol University hold third annual Appathon. [online] Available at: http://mubaloo.com/mubaloo-bristol-university-hold-third-annual-appathon/ [Accessed 28 Oct. 2017].

PICO

Patten, J., Alonso, J. and Ishii, H. (2005). PICO. [online] Tangible.media.mit.edu. Available at: http://tangible.media.mit.edu/project/pico/ [Accessed 30 Oct. 2017].

Pico 2006. (2012). MIT: MIT Tangible Media Group. Available at: https://vimeo.com/44539342

 

Everyware: Programming AI & IOT

  1.  NodeMCU and MQTT
    First, I wired up a NodeMCU board with a rain sensor and an LED. I connected the board to the MQTT broker, so that whenever a certain amount of rain was detected, a message would be published.

    nodeMCUboard

    NodeMCU board, with Sensors & LED connected

    Next, I programmed the board to receive messages back from the broker. I connected the board up to a single LED, which would blink when a message with a specific payload was received.
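Both directions of that exchange reduce to simple threshold and payload checks. A Python sketch of the logic (the threshold value and payload string are illustrative, not the real firmware's values):

```python
RAIN_THRESHOLD = 500  # assumed sensor reading above which "rain" is reported

def rain_message(sensor_value):
    """Decide whether to publish a rain alert for a given sensor reading.
    Returns the payload to publish, or None to stay quiet."""
    if sensor_value >= RAIN_THRESHOLD:
        return "rain_detected"
    return None

def should_blink(payload, trigger="rain_detected"):
    """The LED blinks only when a message with the expected payload arrives."""
    return payload == trigger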

  2.  Image Recognition AI
    I used IBM Watson to create an image recognition system capable of categorizing different dog breeds. Images of dogs (or non-dogs!) can be fed into the system, and the AI reports what kind of dog it is, with a percentage confidence for each breed.

    did2

    IBM Watson recognition

    Next, I connected this up to the MQTT broker, so when a dog breed is detected, it sends a message to the board detailing which breed of dog was seen.
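Watson's classifier returns a list of candidate classes with confidence scores, so choosing the breed to publish is a max-plus-threshold decision. A hedged Python sketch (the response shape and the "breed:" payload format are assumptions for illustration):

```python
def top_breed(classes, threshold=0.5):
    """classes: list of {'class': name, 'score': confidence} entries.
    Return (name, score) of the best match above the threshold,
    or None if nothing is confident enough."""
    if not classes:
        return None
    best = max(classes, key=lambda c: c["score"])
    if best["score"] < threshold:
        return None
    return best["class"], best["score"]

def breed_message(classes):
    """Format the MQTT payload announcing the detected breed."""
    match = top_breed(classes)
    return f"breed:{match[0]}" if match else "no_dog"
```

The threshold stops low-confidence guesses (or non-dogs) from being broadcast to the board.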

  3.  Chatbots
    I made a basic chatbot capable of taking commands such as “switch on the lights” or “open the window”, and connected it to a personal Slack channel for testing. I then hooked it up to the MQTT broker, and later to the NodeMCU board.

    dialoguebot

    Asking the bot to turn on the light using Slack.

    When you ask the chatbot to “turn on the lights”, the message is sent through the MQTT broker to the NodeMCU board, which then turns on an LED. This also works for other commands, such as “open the window”, which spins a small servo.

    LED on

    LED comes on
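End to end, the chatbot pipeline is a command table on one side and a payload handler on the other. A Python sketch of both halves (the topic names and command strings are hypothetical, chosen for illustration):

```python
# Hypothetical command table mapping chat text to (topic, payload) pairs.
COMMANDS = {
    "turn on the lights": ("home/led", "on"),
    "turn off the lights": ("home/led", "off"),
    "open the window": ("home/servo", "open"),
}

def command_to_mqtt(text):
    """Map a recognised chat command to the (topic, payload) to publish,
    or None if the command is not understood."""
    return COMMANDS.get(text.strip().lower())

def apply_payload(topic, payload, state):
    """Mirror what the NodeMCU does on receipt: update LED or servo state."""
    if topic == "home/led":
        state["led"] = (payload == "on")
    elif topic == "home/servo":
        state["window_open"] = (payload == "open")
    return state
```

Keeping the commands in a table means adding a new appliance is one new entry plus one branch on the board, with no changes to the chatbot itself.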