Netscapes: Insight – Final Presentation

Insight: The Big Five is an interactive artwork inspired by psychology and personality. Users are invited to take a short personality test on the built-in touchscreen and see their results displayed in real time within the Immersive Vision Theatre (IVT). The glass head also reacts to the input data with a variety of lighting effects.

IMG_20180123_161751.jpg

User inputs

IMG_20180123_162816.jpg

User input with changing visuals

insightfinalpres.gif

User inputs

 

 


Netscapes: Insight – IVT Testing

Today we did our final build and first live test in the Immersive Vision Theatre (IVT). We started by fitting the Raspberry Pi and touchscreen inside the plinth, then transporting the equipment to the dome ready for our presentation.

IMG_20180122_160933

Fitting Pi3 + Touchscreen

Chris added wooden beams to support the weight of the Pi, as it will be under a lot of pressure when the touchscreen is in use. This should stop the touchscreen pulling away from the plinth.

IMG_20180122_150137.jpg

Setting up in the IVT – Modifying Code

Whilst in the IVT, Gintare updated her code to work better within the shape of the screen. She moved some of the key elements of the visuals so they were more centered within the dome, bringing them to the viewer’s attention.

 

vizlaptop.png

Setting up the visualization

We transported the physical part of our project to the IVT and decided where to set it up. We then tested the project within the space to understand how it would look and feel to viewers and how the colours would display in the dome.

head interface.png

Glass head with touchscreen interface

We took this as an opportunity to double-check our database connections were working. During this time we ran into issues with page refreshing (which I quickly resolved) and with internet connection, which we resolved by using a mobile access point.

headdemo.png

Glass head interface in front of the projection.

We even invited Luke to test out our user interface, and have a go at inputting his own data into the visualization!

head interaction.png

Luke testing out the user interface!

dometest

Head test with visualization within the dome.

Netscapes: Building Bluetooth Connections – Part 2

Today we had access to the physical side of the project, so I tested my Bluetooth code (see my previous post) against the Arduino side. Luckily, after pairing with the HC-05 Bluetooth module, the code worked first time, without any need for debugging!

IMG_20180122_160927

The Arduino side, with HC-05 Bluetooth component & Neopixel ring

Chris and I modified the Arduino code to output different lighting effects based on the character sent over Bluetooth. We settled on a red breathing effect (which I created for a previous project) as the default, plus a rainbow spin effect.

LEDbluetooth

Bluetooth message sent on tapping “Generate”

How it works

  • When the local server is started, it searches through paired devices to find the HC-05 module.
  • When it is found, it opens a connection and sends it the instruction to turn on.
  • When the generate button is pressed, a new message is sent across the connection instructing it to run the rainbow effect.
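The steps above can be sketched in Python. This is a minimal sketch only: the device-list format, the helper names, and the command characters `'o'` (turn on) and `'r'` (rainbow) are illustrative assumptions, not our actual server code.

```python
def find_hc05(paired_devices, target_name="HC-05"):
    """Search the paired-device list for the HC-05 module.

    paired_devices is assumed to be a list of (address, name) tuples.
    Returns the module's address, or None if it is not paired."""
    for address, name in paired_devices:
        if name == target_name:
            return address
    return None


def command_for_event(event):
    """Map a server event to the single character sent over Bluetooth.
    The characters 'o' and 'r' are placeholders for illustration."""
    return {"server_start": "o", "generate": "r"}.get(event)
```

In the real server, the address returned by `find_hc05` would be used to open the serial connection, and each button press would send the corresponding character down it.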

Critical analysis/Reflection

To begin with, we were going to use a separate mobile app to send user data across Bluetooth to the Arduino. Driving the lights from the same touchscreen that collects the user data instead adds a level of interactivity we would not have had with a separate phone app: users instantly see the effect of their inputs, even before the visualization updates.

This also ties the piece together better, making it an all-in-one system rather than being split up.

Future Improvements

If we had more time, I would modify the code to react differently depending on some of the user inputted data, such as changing colours or effects based on values.

 

 

Everyware: The Icebreaker

Presentation Link: https://prezi.com/view/cvW3ewB4za09OKP7Hktq/

The Icebreaker T-shirt is a unique tool for connecting people. It is designed for use at conventions, open days, and other large social events. Plymouth University alone has approximately 23,000 students (University of Plymouth, 2018), many of whom attend open days and freshers' fairs. With the phrase "it's not what you know but who you know" more true than ever, it is vital to grow your social circle and create opportunities both now and for the future.

26972401_10160029255395790_1364697445_o

Icebreaker mobile app

The Icebreaker mobile app accompanies the T-shirt, allowing users to input their login details and add the hobbies and interests to match on. The app is simple: input your details and go, so your attention stays in the moment rather than on your phone.

Watch our promotional video: https://www.youtube.com/watch?v=xVB_mYe-LQw

The interactive T-shirt takes the immaterial concept of shared interests and brings it to life in a visual way. When you meet someone with similar interests, your shirt lights up, telling you exactly who you have matched with and which interests you match on. The accompanying app tells you exactly how close your match is, so you never miss an opportunity to meet someone.

T-shirt designs

Icebreaker Tshirt designs

The Icebreaker T-shirt allows wearers to expand their social circle and meet new people – perfect for university freshers' fairs and open days.

Security features

Don’t want to match with anyone? Only want to match with people of a similar age? Don’t want to share your location? No problem. With the Icebreaker mobile app, you can choose from a range of privacy options.

Future updates include:

  • Personalisation: Customisation of colours and lighting effects based on preferences in the mobile app.
  • More Matches: A more extensive list of hobbies and interests to choose from.
  • Washable: Switching to washable conductive threads & removable LED panels, allowing the T-shirt to be reused.

References

University of Plymouth. (2018). Facts and figures. [online] Available at: https://www.plymouth.ac.uk/your-university/about-us/facts-and-figures [Accessed 19 Jan. 2018].

 

Netscapes: Building

Today we focused on finishing the physical side of our project.

ledhead

The LED ring powered glass head. Colours are being chosen by a mobile phone (off screen).

The second Neopixel ring arrived, so we soldered in some headers and built a circuit board for it. We installed the necessary drivers into the Arduino IDE and programmed it to output different colours.

We then merged this code with the Bluetooth code Chris made earlier.

We mounted the Arduino, breadboard and Bluetooth module inside the plinth, and drilled holes into the head base to accommodate the wiring for the RGB LED ring.

IMG_20180113_161942

The Arduino & Circuitry mounted inside the plinth.

This LED ring is brighter than the other, even inside the diffuser, so it is even better suited to our finalised build!

IMG_20180113_151014_1.jpg

The LED ring mounted in the plinth.

Our next steps:

  • I will develop an app to send characters across a Bluetooth connection, enabling us to remotely change the colour of the head without the need for the Bluetooth Terminal.
  • I need to build server connections for our visualization, input sliders and MySQL Database.
  • Gintaré and Chris need to complete the visualization for the IVT theatre.

 

Netscapes: Development Process

Creating a project is an organic process that contains many twists and turns. Below I will outline some of the changes we had to make during the development of our project.

Before Christmas break, Chris built a wooden plinth to mount the glass head on & house all the electronics. He also designed & 3D printed an inner diffuser for our lighting. This will be displayed in the IVT theatre, as the interactive front-end of our project.

IMG_20180109_164348.jpg

Glass head mounted on plinth (without diffuser). The gap at the front will house the Raspberry Pi & GPIO Touchscreen.

Modified Slider/LED control for Arduino

To further improve the LED lighting part of our piece, we decided to modify it by removing the serial connection and instead using a Bluetooth connection. Chris purchased a Bluetooth module and began to program it to take inputs from mobile.

Chris and I worked together to program the RGB LED code with Bluetooth. We tested the connection using Bluetooth terminal on our Android devices; sending simple “a” and “b” messages to turn an LED on and off remotely. We discovered that this will only work with one device at a time, so we will need to account for this when the model is on display.

We decided on making a mobile app to control the colour of the LEDs, which I will build in Processing over the next few days.

Resolving LED brightness issues 

We found that with the 3D printed inner diffuser in place, the RGB LEDs were not bright enough to light up the glass head.

IMG_20180109_151940

Original setup, with multiple RGB LEDs, Arduino & Bluetooth module.

img_20180121_200116.jpg

Neopixel 24 LED Ring

We tried an LED ring (that I have been using in another project) since it is considerably brighter than the individual LEDs. This worked much better; the colour was visible even in the brightly lit room, and the ring diameter was a perfect fit for the base of the diffuser!

IMG_20180112_132148

Glass head with diffuser and LED ring.

We purchased another LED ring and cut new holes in the mount to accommodate the wiring.

Switching and setting up databases

Due to issues connecting to our MongoDB database, we decided to switch from MongoDB to MySQL.

I set up a new database on my server with access to the necessary tables. I sent Gintaré a link to instructions on how to set it up, along with the necessary details, so she can get to work building the data visualization.

Next Steps
Our next steps are to:

  • Wire up the LED ring and program it to respond to Bluetooth messages (modifying earlier code)
  • Develop an android app
  • Connect the visualization and the slider inputs to my server/Database.

 

Everyware: MQTT matches & Lighting effects

Gerrit redesigned the T-shirt, replacing the original polar bear design with one that helps alert users to which interests they have matched on, as seen below. The design will match the size of the LED ring once it is fitted onto the T-shirt, allowing a quick visual reference to which interests or hobbies the wearers have matched on.

received_10159835159540181 (1).jpeg

Gerrit’s T-shirt design, made to accommodate the LED ring within the design

Since my part of the project includes handling the physical prototyping with NodeMCU, I had to change my code to work with this design.

Creating Lighting Effects – Neopixel 24

The new T-shirt design incorporates a thirds system to allow users to identify which interests they matched on.

IMG_20180116_212159.jpg

For single matches, only specific parts of the board will light up.

I changed my code to accommodate this thirds system – I split the LED ring into thirds relating to which match was made. Since the Neopixel library allows you to set individual LED colours by number, it is possible to count around the ring between specific numbers.

Although the LED ring will need to be ‘reset’ to off between animations, the left/right match code can otherwise stay the same.
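The thirds split described above can be sketched as a small Python helper. This is a hedged sketch: the function name and the 0-based match indices are my assumptions for illustration, not the Arduino code itself.

```python
RING_SIZE = 24  # the Neopixel 24 ring used in the build


def third_leds(match_index, ring_size=RING_SIZE):
    """Return the LED indices that make up one match third.

    match_index is 0, 1 or 2; with a 24-LED ring each third
    covers 8 consecutive LEDs counted clockwise round the ring."""
    per_third = ring_size // 3
    start = match_index * per_third
    return list(range(start, start + per_third))
```

Counting through these indices in order produces the constant clockwise motion, and setting them all back to off 'resets' the ring between animations.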

ledring

Basic match with direction

This visual code works not only for the wearer but for other wearers too – It acts as a quick visual reference to allow you to instantly see who you matched with!

I also made my own functions for lighting effects such as breathe effects and directional matches. The breathe effect cycles through brightness as a percentage (in a similar way to setting the brightness in the code setup). Whilst this works, it requires precise timing on the delays: if you miss the timing, the LED ring will not return to full brightness. Percentages should also be used so as not to overwrite the initially set brightness of the LED ring (to avoid it turning down to '0' brightness and staying there!)
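The breathe cycle can be sketched as a list of percentage levels, which makes the timing pitfall easy to see: the cycle must end back at 100% or the ring stays dim. The step count and the linear fade are illustrative assumptions.

```python
def breathe_levels(steps=10):
    """One breathe cycle as brightness percentages.

    Fades from 100% down towards minimum, then back up, so the last
    value is always 100% – skipping the tail of this list is exactly
    the timing mistake that leaves the ring stuck at low brightness."""
    down = [100 - i * (100 // steps) for i in range(steps)]
    up = list(reversed(down))
    return down + up
```

On the Arduino side, each percentage would be applied relative to the initially configured brightness rather than as an absolute value, so the resting brightness is never overwritten.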

Final Lighting Effects

I modified my code to show more interesting lighting effects and fix some issues that I previously came across.

One of the main issues I noticed was that when a match was made, if the 'background' was left as the default blue, the matches were not as easy to notice (for example, if two matches were made, the last third would stay blue, so it could be interpreted as 3 matches instead of 2). I combated this by programming the LEDs to switch off after matches and between directional animations. This also has the added bonus of being more eye-catching to both users and viewers!

effects.gif

Double match with direction effects

Matches count across one third of the ring (always in sequence, so the motion is a constant clockwise direction for a smoother appearance). If there are multiple matches, this counts across all of the thirds in sequence (as seen above). Once it has finished counting round, the ring lighting "breathes" for one cycle.

After this, the directional lighting begins. This lights up half of the ring in white and green: the ring appears to light up white, with a green "arrow" moving towards the centre in whichever direction the match is. To achieve this, I programmed the LEDs individually to delay the change in colour. This animation runs for 4 cycles.
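One cycle of that directional animation can be sketched as a sequence of frames. This is a rough sketch of the idea only: the index layout (which LEDs count as the "left" half) and the frame representation are assumptions, not the real Arduino code.

```python
def arrow_frames(direction, ring_size=24):
    """Frames for one cycle of the directional animation.

    The half of the ring facing the match is lit white, and a single
    green pixel steps along it towards the centre, giving the moving
    'arrow' effect. Each frame maps LED index -> colour name."""
    half = ring_size // 2
    if direction == "left":
        indices = list(range(0, half))                 # assumed left half
    else:
        indices = list(range(ring_size - 1, half - 1, -1))  # assumed right half
    frames = []
    for green in indices:
        frame = {i: "white" for i in indices}
        frame[green] = "green"  # the one moving green pixel
        frames.append(frame)
    return frames
```

On the hardware, the per-LED delay between frames is what produces the visible motion.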

Handling Multiple devices across MQTT

To keep things efficient, each board receives a unique ID, then an instruction, then a direction. For example, board 1 receives "1", followed by "1" for match one and "2" for match right; at the same time, board 2 receives "2", followed by "1" for match one and "1" for match left.
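The receiving side of that scheme can be sketched as a small parser. The three-character format and the 1 = left / 2 = right convention are inferred from the examples above; the function itself is an illustrative sketch, not the NodeMCU code.

```python
def parse_instruction(message):
    """Parse a 3-character instruction: board ID, match number, direction.

    Direction convention assumed from the examples: '1' = left, '2' = right."""
    board, match, direction = message[0], message[1], message[2]
    return {
        "board": int(board),
        "match": int(match),
        "direction": "left" if direction == "1" else "right",
    }
```

Each board would ignore any message whose board ID does not match its own, so one broker channel can drive every T-shirt in range.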

tshirttable

MySQL database, where the interests are stored

When this is hooked up to Glenn’s MySQL and Node-Red code, these sets of instructions are sent to the relevant T-shirts when a match is made within a certain location. This location is worked out using the Google Maps API (also used to determine the Left/Right directional matches).

simmatch

2 LED rings matching on all 3 interests. Note the motioning towards each other. (Delay due to internet!)

 

Netscapes: Building – MongoDB & Python

This week I have focused on the building stage. After helping my team members get started with p5.js, I got to work building my parts of the project: the back-end and LED control.


Emotion/colour Sliders – Python & Arduino LED control

Part of our project includes the representation of emotions in a visual sense. We decided on creating this using a basic slider input, so I got to work developing it.

I built this using:

  • Raspberry Pi 3
  • Arduino
  • 5″ GPIO touchscreen
  • Python

I created my app using libraries including PySerial (for serial connections using Python) and tkinter (For rapid production of basic user interfaces). I decided to use Python as I have previous experience with creating connections to Arduino using PySerial.

Building circuits

Firstly, I set up the Raspberry Pi 3 with the necessary drivers & fitted the touchscreen. I created a script on the desktop to open an on-screen keyboard (so I wouldn't have to plug in a keyboard for later setup). I then built a basic circuit with an RGB LED and hooked it up to the Raspberry Pi.

IMG_20171208_003440 (1)

My Raspberry Pi with GPIO touchscreen.

Programming

I started off by building a basic slider interface using tkinter and Python. I made sure it was appropriately scaled to the screen, then worked out how to get the output of the slider in real time. In this case, I used a 1 to 8 scale to match our data input app.

IMG_20171210_195700

Basic RGB LED wiring

Once the slider was working correctly, I set up a basic serial connection to the Arduino using PySerial. Since PySerial sends data as bytes, I had to make sure the characters were encoded before sending. I then built a basic script on the Arduino side to receive the characters and change the colour of the RGB LED based on how far the slider was moved (in this case blue to yellow, for sad to happy).
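The encoding step and the blue-to-yellow mapping can be sketched as below. The linear blend is an illustrative assumption (the actual Arduino colour curve may differ), but the byte encoding is exactly what PySerial's `write()` expects.

```python
def slider_to_byte(value):
    """Encode a 1-8 slider value as bytes for serial.Serial.write()."""
    return str(value).encode("ascii")


def slider_to_rgb(value):
    """Linear blend from blue (sad, 1) to yellow (happy, 8).

    Returns an (R, G, B) tuple; the linear interpolation is an
    assumed sketch of the sad-to-happy colour mapping."""
    t = (value - 1) / 7.0
    return (int(255 * t), int(255 * t), int(255 * (1 - t)))
```

In the tkinter app, the slider's callback would call `slider_to_byte` and write the result down the open serial connection on every change.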

Link to my code on GitHub: https://github.com/Mustang601/Arduino-Serial-LED-slider 

Sequence 01_1

My completed LED slider

My next steps are to further develop the user interface, and to figure out how to use this in conjunction with the other user inputs (for database connection).


MongoDB

I created a database in MongoDB, and hosted it on mLabs (due to a software conflict I couldn’t easily host it on my own server, so this was the next best thing!)

The database will hold all the input data from our app and will be used in the creation of our visualization.

mongo input document

Document within MongoDB database

The next step is to connect this database up to the input app and visualization once they are both completed.


Related Links

tkinter: https://docs.python.org/3.0/library/tkinter.html 

PySerial: https://pythonhosted.org/pyserial/

mLabs: https://mlab.com/

MongoDB: https://www.mongodb.com/

Everyware: Icebreaker development – LEDs & MQTT

In this post I will briefly outline my part in the creation of our wearable technology: The Icebreaker T-Shirt.

My responsibilities included creating the physical prototypes: Using the NodeMCU and MQTT connections to build a wearable system.

Prototyping: Single RGB LED

I started off by prototyping MQTT connections with a single RGB LED. I built a basic circuit and programmed it so that it would connect to the MQTT broker, and light up when data is received.

This was a very basic prototype to test MQTT connections and reactions, to see what is possible across MQTT communications and how messages are both sent and received.

During this time, I ran into many internet connectivity issues with the NodeMCU board. Whilst it does work, it often needs resetting, as the connection is prone to dropping out after running for a while. This is a known limitation of the board, and a workaround should be found for the finished product.

IMG_20171125_192722

NodeMCU wired up to the RGB LED for testing messages sent across the MQTT i-dat broker.

RGBW LED Strips

After this, we experimented with using RGBW LED strips. I wired up a circuit (using transistors to prevent burning out my controller board) and experimented with some code to create patterns and colour combinations.

img_20171121_105614.jpg

Testing the RGB LED strip

In addition to issues with powering these strips from the NodeMCU board without an external power source, it became apparent that these strips were not composed of individually addressable LEDs, so they turned out not to be appropriate for our use. It would also be difficult to fit these strips around a T-shirt design without cutting them in multiple places and soldering many bridging wires, resulting in a messy and impractical finish.

img_20171121_122214.jpg

Wiring up the RGBW LED strip with multiple transistors – one for each colour & white.

 

24 RGB LED Ring

Next we moved on to using an Adafruit Neopixel 24 LED ring. First, I soldered some header pins into the ring for easy wiring. Later these could be removed and either soldered onto directly, or potentially connected to using conductive thread.

IMG_20171201_112748

24 LED ring with sample code.

I worked out how to individually address each LED to both change the colour and brightness, and then used the code from the first single RGB LED (shown above) to connect it to the MQTT broker.

After this, I set up basic reactions to each input to simulate what will happen once the database is properly connected, such as changing the colours when a match is detected.

Basic lighting effects/reactions I initially created include:

  • Ambient (No match/resting)
  • Match (Different colours for different matches) – Currently 3 max
  • Left – Left side lights up
  • Right – Right side lights up
img_20171210_210009.jpg

Wiring up my NodeMCU board to the LED ring. The circuit requires the use of capacitors to prevent damage to the LED ring and resistors on the data inputs. (Tape shown over LEDs is for diffusing)
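The four basic reactions listed above can be sketched as a mapping from reaction name to the set of lit LEDs. The index layout (LEDs 0-11 as the left half, 12-23 as the right) is an assumption for illustration; the real NodeMCU code sets colours as well as indices.

```python
def leds_for_reaction(reaction, ring_size=24):
    """Return the LED indices lit for each basic reaction.

    Assumed layout: indices 0-11 form the left half of the ring,
    12-23 the right half."""
    half = ring_size // 2
    if reaction == "ambient":
        return list(range(ring_size))   # whole ring at a resting colour
    if reaction == "left":
        return list(range(half))        # left side lights up
    if reaction == "right":
        return list(range(half, ring_size))  # right side lights up
    if reaction.startswith("match"):
        return list(range(ring_size))   # whole ring in the match colour
    return []                           # unknown reaction: stay dark
```

Each incoming MQTT message would pick a reaction, and the board would light exactly these indices via the Neopixel library.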

The next steps are to create more visually appealing lighting effects & reactions (which I will work on over the next week) and to hook it up to the database via MQTT, which is being handled by my other team members.


 

Links

Adafruit Neopixel/Library: https://learn.adafruit.com/adafruit-neopixel-uberguide?view=all

 

Everyware: AI & Emotional Scoring

In order to test the technologies we plan on using for this project, I built a small prototype. This prototype is a reflection of what we plan to make for our final outcome, but on a smaller scale.

Since I am responsible for designing how the NodeMCU prototyping will work (and later developing it), I put a lot of work into designing how the software will work before developing it:

mqtt design.jpg

Early design of how the MQTT connection will work in conjunction with IBM Watson’s tone analyser

The technologies I used were: Amica NodeMCU board, RGB LED, MQTT & IBM Watson services (speech to text & Tone analysis).

How it works

Everyware prototype (1).png

A voice input is taken via an app. The speech is converted to text using IBM Watson's speech-to-text service. This text is then fed into IBM Watson's tone analyser, which returns an emotional ID (such as Happy, Sad or Angry) and a percentage score.

This emotional ID/score data is then processed in JavaScript/Node-RED and published across the MQTT broker on a specific channel.

The NodeMCU board is subscribed to the same channel and receives the processed data. This is then used to determine which colour to make the RGB LED it is connected to.

Physical Prototype

I built the basic speech-to-text app using PhoneGap, as it is an ideal solution for rapidly prototyping apps that will work on a wide range of mobile devices. It also has dedicated libraries for MQTT connections.

I programmed the NodeMCU board to receive the data from the MQTT and use that to determine what colour to make the RGB LED. Since the tone ID & score were simplified into integers earlier, all it has to do is take the returned number and use it to control the colour and brightness, such as turning it blue for sadness, and making it brighter for a high percentage.
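The colour-and-brightness decision can be sketched in Python. The integer tone IDs and base colours here are hypothetical (the real mapping produced by the Node-RED side is not documented in this post); only the idea of scaling brightness by the percentage score comes from the prototype.

```python
# Hypothetical tone IDs for illustration: 0 = sadness, 1 = joy, 2 = anger.
TONE_COLOURS = {
    0: (0, 0, 255),    # sadness -> blue
    1: (255, 255, 0),  # joy -> yellow
    2: (255, 0, 0),    # anger -> red
}


def led_for_tone(tone_id, score_percent):
    """Scale the tone's base colour by the confidence score.

    A stronger emotion (higher percentage) lights the LED more
    brightly; unknown tones fall back to white."""
    base = TONE_COLOURS.get(tone_id, (255, 255, 255))
    scale = score_percent / 100.0
    return tuple(int(c * scale) for c in base)
```

So an Anger result at 100% gives full red, while a weak Sadness result gives a dim blue, matching the behaviour described above.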

IMG_20171125_192722

NodeMCU with RGB LED

Screenshot_2017-11-25-19-10-44

RGB LED turns red when Watson’s Tone analyser detects Anger