Archive for the ‘Emote’ Category

Experiencing Self vs. Remembering Self

Monday, April 8th, 2013

Two years ago, while finishing up my studies at ITP, I was working on a thesis project called Emote [link to video | link to paper]. The intended goal of this project was to develop a prototype for a system or platform that would support “my practices for nurturing emotional attentiveness… [a platform that could help me] bring awareness to my emotional processes, identify my emotional triggers and scripts, distinguish constructive and harmful emotions, and nurture constructive emotions.” Ultimately, I wanted to create a system that would help me live a more fulfilling life. A tall order, I know.

Over the last few days I have been reading the last section of Daniel Kahneman’s book Thinking, Fast and Slow. In this part of the book, Daniel focuses on exploring our “Two Selves”, or the two different ways that we experience the world. The view he puts forward has given me a whole new perspective on my thesis project and has helped me identify the faulty assumptions that clouded my thinking. Though I have not continued working on Emote, I have not lost my passion for exploring ideas related to the pursuit of meaning, awareness and consciousness, nor my interest in exploring how technology (digital or otherwise) can help us achieve this goal.

Kahneman posits that we experience the world in two different and conflicting ways – as the experiencing self and the remembering self. Here is a brief description of how each of these works:

The Experiencing Self
The experiencing self refers to the way in which we experience the world while the experience is taking place, in the moment itself. For example, the way we experience pain while undergoing a medical procedure, or pleasure while sharing an intimate sexual encounter. This part of ourselves experiences reality on a moment-by-moment basis – duration and time are central aspects of the experience. As such, from the standpoint of the experiencing self, to improve our well-being we should strive to extend the duration of pleasurable moments while minimizing moments of suffering.

The Remembering Self
The remembering self refers to the way in which we experience things after they have taken place, through the prism of our memories. For example, the way we experience a medical procedure or intimate sexual encounter through the stories we create about these events after the fact. Due to the characteristics of our memory, this mode of experience emphasizes moments of peak intensity and end moments. For these reasons, our memories don’t properly reflect the time or duration of events. This means that from the standpoint of the remembering self, to improve our well-being we should strive to find experiences that feature positive high-intensity moments and that end well, and avoid experiences that have negative high-intensity moments and that end badly, with little regard to duration.

Who Guides Our Decisions
According to Daniel, the remembering self tends to influence our decisions much more strongly than the experiencing self. The characteristics of our remembering self create a bias in favor of goods and experiences that are initially exciting, even if they eventually lose their appeal. Considerations of time and duration are neglected, causing experiences that retain their value in the long term to be appreciated less than they deserve.

Back to Emote
Emote was largely an attempt to focus my attention on the world as experienced by my experiencing self. The tools and processes that I created centered on tuning in to my emotional states and quantifying their duration. My efforts were based on the assumption that by uncovering insights about the factors that contributed positively and negatively to my moment-by-moment experience, I would be able to make a positive impact on my pursuit of fulfillment. Now I realize that this was a flawed approach, because the pursuit of fulfillment is largely a concern of the remembering self – it is a pursuit that is based in personal stories and a search for meaning.

I am still interested in exploring ways to focus more attention on our moment-by-moment experiences of the world, and to loosen the grip that the stories from our remembering selves have on our assessments and decisions – though I have no ideas or designs at the moment for a project to explore this area of opportunity. I also agree with Kahneman’s point that we need to take into account both of our selves in our pursuit of wellness for ourselves and our society.

Caveats and Credits
A quick caveat: this is my understanding of Kahneman’s theory coupled with a bunch of my own murky thoughts. I am sure I am oversimplifying and misconstruing several of the ideas that he makes so eloquently in his book. So take this with a grain of salt, and read his book.


Emote [Thesis Presentation]

Saturday, May 21st, 2011

Here is the presentation for my thesis project Emote. It was delivered on Monday, May 9th. Below the video is a slideshow featuring my presentation slides.

Emote [Thesis Presentation] from Julio Terra on Vimeo.

Here are the slides:


Emote [Thesis Paper]

Saturday, May 21st, 2011

Phase 2.0 of Emote is officially done: I delivered my thesis presentation last week and finished my thesis paper earlier this week. I am very happy with how Emote continues to evolve. The thesis paper and presentation provided a source of motivation, and a great opportunity to document my work. Here are two links to download the pdf of my thesis paper [High-res pdf | Low-res pdf]. Below is the abstract of the thesis paper. The presentation is available in another post here.

Abstract: Emote is a platform that supports my practices for nurturing emotional attentiveness. Emote consists of five components: an emotion journal to keep me tuned into my emotions; an activity log to help me track activities related to my practice; the Emote wristband to monitor my physiological state; a database to integrate all this data; and an application to visualize the data in search of patterns and insights.

Emote was designed to help me: bring awareness to my emotional processes, identify my emotional triggers and scripts, distinguish constructive and harmful emotions, and nurture constructive emotions.

Emote has succeeded in supporting my practice across all the areas listed above. However, not all of Emote’s components contributed to this success. The emotion journal and activity log have supported and elevated my emotional attentiveness practice. On the other hand, the wristband, database and visualization application still require further work.


Learning to Make a PCB

Wednesday, April 20th, 2011

Over the past few weeks I have been working on creating my first PCB (printed circuit board) for a project called Emote. The process has been extremely rewarding, though at times also frustrating. The potential offered by PCBs is pretty amazing: they make electronics much smaller and open up numerous design possibilities.

There are many reasons why I wanted to design my own PCB: there are good free PCB design tools available, numerous high-quality tutorials for these tools exist online, and the cost of fabricating a PCB is pretty reasonable.

Here is a high-level tutorial that provides an overview of the entire process, with links to the various resources you need to learn how to make your own PCB.

Getting Set Up
The first step is getting set up by downloading and installing a PCB design application along with the appropriate libraries and extensions. Here is a list of the applications I used, along with associated libraries and extensions.

Eagle CAD [link to site]. This is the design software where you create schematics and boards. The application felt counter-intuitive at first, since I am accustomed to Adobe Creative Suite-type tools. After about 10–15 hours of use I got used to it and started to enjoy the work. What I am trying to say is that there is a learning curve, but don’t get discouraged.

Sparkfun Eagle Library [link to download]. This library is extremely useful, especially if you buy stuff from Sparkfun, since it contains a large number of components that they use frequently on their own boards. If you plan to use any of the tutorials I list below, then this is a must. Here is Sparkfun’s own description and instructions: “This is the collection of all the components SparkFun designs with and therefore components and footprints that have been tested. Unzip and place the SparkFun.lbr file into the Eagle\lbr directory. If the above link does not work, google ‘sparkfun eagle library’ to get the latest collection.”

Sparkfun Eagle CAM File [link to download]. This file should be used along with the Sparkfun library. Therefore, if you download the library file make sure to get this one as well. Here is their own description and instructions: “This file is responsible for creating the gerber files for submission to a PCB fab house. Place this file in the Eagle\cam directory.”

Sparkfun Eagle Shortcuts [link to download]. The shortcuts in this file are used in the tutorials that I reference below. I did not use them, but you should make up your own mind about whether or not to download the file. “Place this file in the Eagle\scr directory.”

Though I only used the Sparkfun library in my project, there are several other libraries out there that you should know about. Here are the other two most popular libraries:
Lady Ada’s Eagle Library [link to webpage]
Microbuilder’s Eagle Library [link to webpage]

Designing a PCB
Once you have installed the software and placed the library and extension files in their proper locations, you are ready to start up Eagle. As a next step I recommend that you go through the Sparkfun tutorials, which are easy to follow and comprehensive.

Eagle, an Overview. Before you jump into the Sparkfun tutorials, here is a quick overview of how Eagle is set up. This high-level overview will provide some context for the tutorial links featured below.

Eagle handles the design of components and boards separately. Components are always created within libraries: to create a new component, you need to create a new library or add the component to an existing one. Boards are created within projects. Note that the term board is also used to refer to one of the standard views within Eagle.

Components and projects both contain two standard views: the schematic view and the board view. The schematic view provides abstracted information about a component or board. Component schematics are comprised of pins, outlines and labels; board schematics are composed of components, electrical links between components, outlines and labels.

The board view provides an accurate model of the physical features of a component or board. For a component, the board view includes the physical location of the pads for each pin, the overall dimensions, and any other important markers. For a board, this view features the physical location of each component along with wire connections, drill holes, plates, and all other relevant specifications for fabrication.

When designing a component or project in Eagle you always create the schematic first. The schematic feature is actually a great planning tool for any DIY electronics project. Schematics are created first because they enable you to plan your circuit conceptually before working on the physical design.

Sparkfun Tutorials. Now that you’ve got a rough understanding of how Eagle works, here are some tutorials from Sparkfun that will walk you step by step through the process of creating a schematic, designing a PCB layout, and making a custom part for your project. I’ve also included a link to a tutorial where they share common issues they’ve encountered during their many years of experience.

PCB Fabrication
Once you have gone through the tutorials above and finished designing your first board, you will be in an excited hurry to get it printed. You are almost there, but be patient. Before I even get into the next steps, I recommend that you run the following checks on your board:

  • Check all electrical connections to make sure there are no incorrect overlaps. It will cost you money and time if you don’t catch these before you fabricate.
  • Print out your board design on a piece of paper and check that the sizes of all components are correct, especially if you are using untested component designs.

Gerber Files. To fabricate a board you need to generate Gerber files, which are the industry standard for PCB fabrication. The process for generating these files is covered in the Sparkfun tutorial about designing a PCB. For a 2-layer board, Eagle will generate 7 files (assuming you are using the Sparkfun CAM file). All of these files need to be submitted to the fabrication house. A small annotated example follows the list below.

Here is the extension for each Gerber file along with a brief description of its function:

  • GTL: top copper layer, holds location of electrical connections (i.e. ‘wire’)
  • GTS: top soldermask layer, holds edges of solder mask that protects from bleed overs
  • GTO: top silkscreen layer, holds text and lines that will be printed onto board
  • GBL: bottom copper layer, same as top layer but for bottom of board
  • GBS: bottom soldermask layer, same as top layer but for bottom
  • GBO: bottom silkscreen layer, same as top layer but for bottom
  • TXT: drill file, holds the location of all drill holes on the board
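
One thing that demystified Gerber files for me is that they are plain text, so you can open them in any text editor. Below is a tiny hand-written fragment in the RS-274X dialect to give a feel for the format; the coordinates and aperture size are made up for illustration, and the files Eagle generates will be far longer (G04 lines are comments, M02 ends the file):

    %FSLAX24Y24*%
    %MOIN*%
    G04 absolute coordinates, 2 integer and 4 decimal digits, in inches*
    %ADD10C,0.0100*%
    G04 the line above defines aperture D10 as a 0.01 inch circle*
    D10*
    X004000Y003000D02*
    G04 D02 moves to (0.4, 0.3) with the light off*
    X008000Y003000D01*
    G04 D01 draws a trace to (0.8, 0.3) using the selected aperture*
    X006000Y005000D03*
    G04 D03 flashes the aperture once, e.g. to create a pad*
    M02*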

It is important to review your Gerber files before you send them out for production. Unfortunately, I have not found any good free Gerber file readers for Macs. There is an online Gerber viewer available at http://circuitpeople.com/, though it does not let you view multiple Gerber files layered on top of one another. If you know of a good Mac Gerber file reader, please leave a link in the comments section.

There is a great online service at FreeDFM.com [http://freedfm.com] that processes Gerber files and provides feedback regarding issues, and even fixes some of the problems automatically. This service is provided by the guys at Advanced Circuits.

Fabrication Options. There are many different PCB fabrication houses around the country (and beyond our borders). Below I’ve included the two that have been recommended to me. Another option is to create your own PCB at home. Here is a link to a tutorial that shows you how to do just that. A few friends of mine have used this approach successfully.

First and foremost, Advanced Circuits [http://www.4pcb.com/]. This is where I will be sending my first PCB for production later this week. This place was recommended to me by Paul Rothman, one of the residents at ITP. They provide great deals for students ($33 full-featured 2-layer prototypes), and they also offer really fast turnarounds on their bare-bones boards.

Next up is BatchPCB. I have not used them either. I have listed them here because they seem to have an interesting business model and I have seen a lot of stuff about them on Sparkfun (I will admit much of it was their advertising).


Emote [Thesis]

Wednesday, April 13th, 2011

It’s taken me a really long time to write this, my first post about my thesis. I decided on my current direction in late February, and during the last month and a half I have been focusing my efforts on data collection and setting up the database back-end for this project. Without further delay, here is an overview of the project. I plan on providing some updates over the next couple of days, so stay tuned.

My Thesis in One Sentence: Emote is a platform for tracking my emotions and creating high-resolution visualizations that explore sources of meaning and fulfillment in my life.

Slightly More Informative Description: Emotions are important. They strongly affect how we experience reality by influencing our perception, choices and actions. Emotions are also messy and hard to measure: they are highly subjective, short-lived phenomena created by interactions between physiological and cognitive processes in response to our psychological, social and physical context.

Emote explores methodologies for collecting and integrating high-resolution data about the physiological states, cognitive processes, and contextual factors associated with my emotions. To uncover insights from this data, Emote uses visualization approaches to highlight correlations between my emotions and activities, people, and places in my life.


Regular Expressions: Patterns & Rules

Tuesday, January 11th, 2011

Image from AJ Brustein, used under CC license

Over the past week I’ve become acquainted with Regular Expressions, also known as regex to those familiar with this powerful tool. I have only begun to scratch the surface, but I am already blown away by what this text-parsing engine can do. Here I will provide a brief tutorial on how to use regex with Processing, and share some useful resources for anyone who is interested in learning more about it.

What is Regex?
According to Wikipedia’s definition, regex is a formal language that “provides a concise and flexible means for matching strings of text, such as particular characters, words, or patterns of characters… A regular expression can be interpreted by a regular expression processor, a program that either serves as a parser generator or examines text and identifies parts that match the provided specification.”

In other words, regex provides a set of rules that enables users to search for and extract pieces of text from a larger text source (be it a file or a data variable). I used regex to extract specific bits of information – such as dates, titles and body copy – from over 300 posts to one of my blogs.

Regex has been incorporated into many different computer languages. PHP, Ruby, C++, Python, and Java all offer regex in slightly different flavors. For my project I used the regex functionality that is available in Processing.

Learning Regex
Now I’ll dive into a few technical examples to show you how it works. Here is a link to a good quick start guide that helped me get started; the information I cover below all comes from that site. The way regex works is that you define a search pattern that is used by the regex engine as a query. Here is an overview of the most common elements of a regex pattern (a short Processing sketch after the list shows several of them in action):

  • Literal Characters: these are the simplest type of regex patterns; they provide a straightforward match. For example, the pattern “cat” will match any instance of “cat”, whether standalone or in words such as “catalog” and “catastrophe”.
  • Character Classes: these patterns will match only one out of several characters. Character classes are identified by square brackets. For example, the pattern “[bc]at” will match any instance of “cat” or “bat”, whether standalone or in words such as “battery” and “catalog”.
  • Shorthand Character Classes: these characters will match one character of a specified type, such as digits “\d”, word characters “\w”, or space characters “\s”. Using a capital letter, such as “\D”, creates a negated match that matches any character that is not of the specified type.
  • Non-Printable Characters: these enable you to add non-printable characters to your regex pattern. For example, “\n” specifies a new line feed while “\r” identifies a carriage return. Regex also supports the syntax “\xFF” to use a hexadecimal number to specify an ASCII character.
  • The Caret: creates negated matches when used within square brackets. Similar to the way that capital letters function for the shorthand character classes, the caret allows the specification of characters that should not be matched. For example, “[^b]at” will match sequences that contain the characters “at” preceded by any character other than “b”.
  • The Dot: matches any character except line breaks. For example, the pattern “.at” will match any instance of “at” that is preceded by a non-line-break character. It is important to note that this pattern will not match “at” all by itself. The dot is very powerful and can easily cause unwanted matches, so use it sparingly.
  • Anchors: these characters specify a location within a text source. The “^” specifies the beginning of a text source or line of text (depending on the regex mode), while “$” specifies the end of a text source or line of text. To match positions at the beginning or end of words, use the word boundary “\b” rather than these anchors.
  • Alternations: the “|” character provides the equivalent of a logical “or”. It is used to identify multiple different potential patterns for a match. For example, “bat|cat” will match either “bat” or “cat”.
  • Optional: enables you to specify optional characters for a given match. For example, “c?at” has an optional “c” so it will match the pattern “at” or “cat” whether standalone or within a bigger word.
  • Repetition: the “*” and “+” characters can be used to enable a pattern to be matched repeatedly. They differ in that “*” matches a character 0 or more times, while “+” matches 1 or more times. For example, the pattern “ca*t” will match “ct”, “cat”, or “caat”, while the pattern “ca+t” will match “cat” or “caat” but not “ct”.
  • Greedy and Lazy Patterns: the difference between greedy and lazy patterns is that greedy patterns (which are the standard) will expand the match as much as possible, while lazy patterns will keep the match as short as possible. Using the “?” character makes a match lazy. For example, for the string “cathat” the pattern “c[a-z]+t” will match “cathat”, while the pattern “c[a-z]+?t” will only match “cat”.
  • Groups: by using “()” we can specify groups. This is useful because we can apply quantifiers to these groups, and many regex-based functions are able to return the values of these groups separately (this was very useful for my sketch).
  • Lookarounds: these patterns are similar to anchors in that they enable you to find a position defined by a pattern without including that pattern in the overall match. For example, the lookbehind “(?<=c)at” will match the “at” in “cat”, but it will not match a standalone “at”, because it requires a preceding “c” that is excluded from the match.
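
To see several of these elements in action, here is a minimal Processing sketch; the test strings are toy examples of my own:

    void setup() {
      String text = "cat bat catalog cathat";

      // character class: matches "cat" or "bat", standalone or inside words
      String[][] m1 = matchAll(text, "[bc]at");
      println(m1.length);   // prints 4

      // greedy vs. lazy repetition on the same string
      println(matchAll("cathat", "c[a-z]+t")[0][0]);    // prints "cathat"
      println(matchAll("cathat", "c[a-z]+?t")[0][0]);   // prints "cat"

      // groups: parentheses capture sub-matches separately
      String[][] m2 = matchAll("cat hat", "[ch](at)");
      println(m2[1][0] + " " + m2[1][1]);   // prints "hat at"
    }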

Here is a link to Regex Pal, a great tool for testing regex patterns.

Regex and Processing
Regex can be used with several functions within Processing. The two functions that I used to extract and clean text from the web page were:

matchAll(string, “regex”): this Processing function takes a source string and a regex pattern, and returns a two-dimensional array with the matching patterns and their sub-patterns (groups). For example, applied to the text “cat cat hat”, the pattern “[ch](at)” would return the following array: [0][0] “cat” and [0][1] “at”, [1][0] “cat” and [1][1] “at”, [2][0] “hat” and [2][1] “at”.

string.replaceAll(“regex”, “new text”): this function takes a regex pattern as its first argument and replaces all instances of that pattern with the new text provided as the second argument. (Note that Java’s plain string.replace() treats its first argument as literal text, not a regex.)
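
Here is a short sketch showing how these two functions combine for the kind of blog-scraping task described above. The HTML snippet and pattern are simplified stand-ins for whatever markup your own source uses:

    void setup() {
      // a simplified stand-in for one post's markup
      String html = "<h2>Emote Update</h2><span class=\"date\">2011-01-11</span>";

      // two capture groups pull out the title and the date separately
      String[][] m = matchAll(html, "<h2>(.+?)</h2><span class=\"date\">(.+?)</span>");
      if (m != null) {
        println("title: " + m[0][1]);   // prints "title: Emote Update"
        println("date: " + m[0][2]);    // prints "date: 2011-01-11"
      }

      // replaceAll strips any tags left in the body copy
      String body = "<p>Some <em>body</em> copy</p>";
      println(body.replaceAll("<[^>]+>", ""));   // prints "Some body copy"
    }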


MoodyJULIO: Mood Tracker Technical Details

Thursday, January 6th, 2011

Now that the semester is over, I finally have time to document the hardware and software solution that I designed to capture data for my MoodyJulio project. This covers only the first phase of the project, as I am still working on designing the data processing and visualization components.

Before I dive into the weeds, let me quickly recap the inspiration for this project. Over the last year I have become more conscious of my health (both mental and physical). This project was designed to provide me with insights into how different activities, people, and places affect my emotions and moods. Here are links to several journal entries that provide a conceptual overview of this project.

  1. Investigating My Physiological and Emotional States
  2. Excited, Emotional and Moody

Designing the Mood Tracker
Designing the mood tracker involved more than just finding a way to fit all the components into a wearable package (though that was a big challenge); it also required figuring out the process for interacting with the user in explicit and non-explicit ways. The design of these interactions had an impact on both the physical and virtual (code) aspects of this project.

You may be wondering what the hell I mean by explicit and non-explicit interactions. The explicit interactions included capturing data from the user regarding their emotions, and enabling the user to easily transfer data between the device and their computer or the internet. The non-explicit interactions refer to the data captured by the various sensors. I call these non-explicit because my goal was for the user to forget that these sensor readings were even taking place, so that I could minimize their impact on user behavior.

To arrive at the final process, which is outlined in my previous journal entry (Investigating My Physiological and Emotional States), I tested several different approaches. My goal was to achieve objective subjectivity, which is the most I could hope for given that my focus was on emotion tracking and that emotions are such subjective phenomena.

From this testing I came up with a few important process improvements that helped me get closer to the objectivity that I was looking for:

  1. To ensure that I was capturing emotional data in an objective manner I ultimately made the mood tracker responsible for identifying the times of day when I needed to record how I was feeling. During the testing phase I would record this type of information whenever I felt like it. This tended to skew the data because I would only input data when I was feeling strong positive or negative emotions – rarely when I was bored or focused on an activity.
  2. To help me analyze how I was feeling I created a long list of positive and negative emotions that I used for my mood journal posts. I know how hard it is to reduce what we feel to a single emotion (especially now that I have been doing this for several weeks); I often feel multiple and even conflicting emotions. Nonetheless, after much thought I decided that if I did not take this approach, the data would be too qualitative/subjective and therefore hard to use in the analysis phase. I plan to re-examine this choice in future mood tracking efforts.

The Physical Build
Over the length of the semester I created three (3) different prototypes before settling on the design for the final version of the mood tracker. Here is a list of the components I used for the final tracker, followed by a brief overview of the function of the most important ones. I’ve also included pictures of the most important elements.

  1. Arduino Lilypad: the Lilypad functions as the main microcontroller for this device. All other components are controlled by this nifty little piece of technology. This Arduino is responsible for capturing data from all sensors, storing the data on the SD card, identifying when to page the user for input regarding the emotions and moods, and then capturing the input via the two push buttons.
  2. Heart Rate Monitor Interface: the Heart Rate Monitor Interface connects to the Polar heart rate transmitter; together these devices capture the user’s heart rate and provide that data to the Lilypad. The Heart Rate Monitor Interface also features four (4) input pins. I used three of these pins to attach the accelerometer to this device, rather than linking it directly to the Arduino. This component is connected to the Arduino using the I2C protocol.
  3. GPS Module: the GPS module is only used here for time keeping. This project did not include a location-based element (though I considered it). However, since I was capturing data via multiple different platforms, it was crucial to have all input time-stamped so that it could be matched together during the analysis and visualization phase (by itself the Arduino can only provide milliseconds since startup). This component is connected to the Arduino over a serial connection.
  4. microSD Card Logger: the OpenLog logger enables the Lilypad to log the data it captures from the sensors. This logger uses a micro SD card (the type commonly used in many cell phones) to log information. It is easy to set up and records all data that the Arduino sends out via the serial port.
  5. Electrodes & 10K Resistor: the electrodes and resistor were used for sensing Galvanic Skin Response (GSR). GSR can be sensed by using two electrodes to measure conductivity changes in the skin; the best place to measure GSR is your fingers. I won’t get into the details of GSR sensing here – these three links will give you some guidance: (1) Che-Wei Wang’s Arduino and Processing code for sensing and graphing GSR; (2) Mustafa Bagdatli’s overview of different materials used to sense GSR; (3) Sean Montgomery’s Truth Wristband, which uses GSR technology. A minimal sketch of the read loop appears after this list.
  6. Vibration Motor, Momentary Push Buttons & LED: the motor and LED are used to notify/page the user whenever it is time to post data about emotions. When input is requested, the user can press the push buttons to indicate whether they are feeling positive or negative. A note regarding the motor: since the motor uses up a lot of power, I made sure to minimize its use. The small motor I used was different from this one, but I decided to feature this one as I think it would have worked even better.
  7. 3-Axis Accelerometer: the accelerometer I used for this project was an older version of this component. The purpose of this element is to measure how much physical activity is taking place. I decided to connect this component to the Heart Rate Monitor Interface (rather than directly to the Arduino). Unfortunately, my accelerometer (which was already quite old) stopped working a few days after I finished assembling the mood tracker – so I ended up not using it.
  8. Icebreaker Sours Box: this is the box for one of my favorite candies (Icebreaker Sours – Berries). It also happens to be a good little enclosure for projects such as this one.
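
As an aside on the GSR circuit, the code side is very simple: one electrode goes to power, the other to an analog pin, and the 10K resistor runs from that pin to ground, forming a voltage divider. Here is a minimal Arduino sketch of the read loop; the pin number, smoothing window, and timing are hypothetical rather than the values from my actual build:

    const int GSR_PIN = 0;    // hypothetical analog pin
    const int SAMPLES = 10;   // hypothetical smoothing window

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // average several readings to smooth out sensor noise
      long total = 0;
      for (int i = 0; i < SAMPLES; i++) {
        total += analogRead(GSR_PIN);
        delay(10);
      }
      // higher values mean higher skin conductance (more arousal)
      Serial.println(total / SAMPLES);   // the OpenLog records this line
      delay(500);
    }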

Writing the Code
It took me a long time and many iterations to write the code for this project; this was by far the most complicated Arduino project that I have done to date. On a positive note, I really enjoyed learning how to take my Wiring skills to the next level (and this practice came in really handy for my NIME project). Here is a link to the current version of the code; I won’t post it all here because I don’t want this blog post to become too long.

In the end I structured my code primarily based on functionality. I did not use object-oriented coding methods as much as I would have liked to – in large part because I am still learning how to use object-oriented programming with Arduino’s Wiring code platform. When I say that I structured the code based on functionality, I mean that I distributed the functions that perform different functionalities into separate tabs (or .pde files). Here is the breakdown, followed by a trimmed sketch of the paging-and-input flow:

  1. Mood Mapping Main Functions: this is where you will find Wiring’s core setup() and loop() functions, along with the high-level definition of the gpsConnect class (the only object-oriented class in this sketch).
  2. GPS Class Functions: features all the functionality associated with connecting and communicating with the GPS unit. I used the NewSoftSerial library here since I needed the standard serial port to connect to the data logger.
  3. Read Data Functions: here you will find all functions responsible for reading data from the various sensors.
  4. Serial Output Functions: this is where you will find functions that print data to the serial port for logging.
  5. Physical Output Functions: features functions that govern the physical output via vibration motor and LED.
  6. Wire HRMI Functions: includes functions that use the wire library to communicate with the Heart Rate Monitor Interface, which is connected to the Arduino via I2C.
  7. Sketch Notes: here you will find exactly what you would expect, notes about this sketch.
  8. Deprecated Functions: this is where you will find functions that are no longer in use.
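
To make the structure above more concrete, here is a heavily trimmed sketch of the paging-and-input flow. The pin numbers, the prompt interval, and the readHeartRate() stub are all hypothetical; the real code (linked above) spreads this logic across the tabs described in the breakdown:

    // Hypothetical pin assignments -- the actual Lilypad wiring differed.
    const int LED_PIN = 13;
    const int MOTOR_PIN = 9;
    const int POS_BUTTON = 2;
    const int NEG_BUTTON = 3;

    const unsigned long PROMPT_INTERVAL = 40UL * 60UL * 1000UL;  // 40 minutes
    const int HR_THRESHOLD = 95;                                 // beats per minute

    unsigned long lastPrompt = 0;
    boolean waitingForInput = false;

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      pinMode(MOTOR_PIN, OUTPUT);
      pinMode(POS_BUTTON, INPUT);
      pinMode(NEG_BUTTON, INPUT);
      Serial.begin(9600);   // the OpenLog records everything printed here
    }

    void loop() {
      int heartRate = readHeartRate();
      // page the user on a timer, or early if the heart rate spikes
      if (!waitingForInput &&
          (millis() - lastPrompt > PROMPT_INTERVAL || heartRate > HR_THRESHOLD)) {
        pageUser();
      }
      // while waiting, poll the two buttons for a positive/negative answer
      if (waitingForInput) {
        if (digitalRead(POS_BUTTON) == HIGH) logMood(1, heartRate);
        else if (digitalRead(NEG_BUTTON) == HIGH) logMood(-1, heartRate);
      }
    }

    void pageUser() {
      digitalWrite(LED_PIN, HIGH);
      digitalWrite(MOTOR_PIN, HIGH);   // brief buzz; the motor is power-hungry
      delay(500);
      digitalWrite(MOTOR_PIN, LOW);
      waitingForInput = true;
    }

    void logMood(int valence, int heartRate) {
      digitalWrite(LED_PIN, LOW);
      Serial.print(millis());
      Serial.print(",");
      Serial.print(heartRate);
      Serial.print(",");
      Serial.println(valence);   // 1 = positive, -1 = negative
      lastPrompt = millis();
      waitingForInput = false;
    }

    int readHeartRate() {
      return 70;   // stub; the real sketch queried the HRMI over I2C
    }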

MoodyJULIO: Investigating My Physiological and Emotional States

Friday, November 19th, 2010

For the last few months I have been working on designing a system that enables me to gain insights regarding my moods and emotions. As I outlined in my previous journal entries, this system needs to enable me to monitor my physiological and psychological behavior, and enable me to cross-reference this data with contextual information about my activities. This past week, after a long testing and preparation phase, I finally launched this project and gave it an appropriate title: MoodyJulio.

Today I will share the completed project proposal that I presented to my colleagues earlier this week, followed by detailed information about the final design of the methodology for my data capture and monitoring system. I will also provide an overview of the next step in this project, comprised primarily of data analysis and visualization work. Lastly, I will share some interesting questions that have been raised by the work that I have done so far (several of which are not directly addressed by this project). Enough preamble, let’s get started with my final project proposal:

MoodyJULIO Project Proposal

Tracking and Measurement Methodology
The design goal for the methodology that I created was to devise a system that supports “objective subjectivity”. In other words, I wanted to create an objective way to capture my subjective experience of moods and emotions. At the same time, I had no desire to apply the same level of rigor as medical or scientific studies. The system that I created has 3 core components and 2 fun components. Let’s start with an overview of the core components:

  1. Biofeedback Component: the first component encompasses the processes and tools for measuring my physiological response to the world. My focus here is on collecting heart rate and galvanic skin response data, two variables that correlate strongly with physiological levels of excitement. To collect this data I use two small devices that I wear throughout the day: a Polar heart rate transmitter and a homemade device (dubbed MyMoody) that captures GSR readings and logs that data along with the heart rate information and a timestamp.
  2. Self-Analysis Component: the second component is comprised of the processes and tools for measuring my emotional state. This is the main area where subjectivity comes into play, so I spent a good amount of time thinking about and testing different approaches. The solution I devised leverages the MyMoody device and a smartphone to enable objective-subjective tracking of my emotional states. The MyMoody device prompts me every 40 to 80 minutes, or whenever my heart rate exceeds 95 beats per minute, to log my emotional state. When prompted I submit a short post about my emotional state using my smartphone.
  3. Contextual Component: the last core component is focused on enabling me to capture information about the context associated with my physiological and emotional states. The tools associated with this component are a smartphone and/or a computer. When posting updates about my emotional state via my smartphone I will also record my current activity and note who I am with at that moment. At the end of every day I will update my calendar with a detailed overview of my activities during that day.

Now here is a brief description of the two fun components:

  1. Social-Analysis Component: this component includes processes and tools for capturing other people’s perspectives on my moods and emotions. The process for capturing this input is supported by an old-school paper business card (pictured in the presentation above), a posterous.com blog, and a Twitter feed. Here is how the process works: when I talk to someone, I request that they submit a report regarding my mood. I give them a card that reminds them of this request and provides an email address where they should submit this information. When an entry is received, it is added to the posterous blog and the Twitter feed.
  2. Picture-Analysis Component: this component aims to provide insights into my moods by capturing images of me when I am using my computer. The process is quite simple: whenever I am using my computer, it takes a picture of me every minute using the webcam. These pictures are then stored using sequential file names. A minimal Processing sketch of this capture loop follows below.
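
For the curious, the capture loop behind the picture component is simple enough to sketch in Processing with the video library; the frame size, file prefix, and one-minute interval below are illustrative stand-ins rather than the actual setup:

    import processing.video.*;

    Capture cam;
    int lastShot = 0;   // time of the last saved frame, in milliseconds
    int counter = 0;    // sequential file counter

    void setup() {
      size(640, 480);
      cam = new Capture(this, 640, 480);
      cam.start();   // begin streaming frames from the webcam
    }

    void draw() {
      if (cam.available()) cam.read();
      image(cam, 0, 0);
      // save one sequentially numbered frame per minute
      if (millis() - lastShot > 60 * 1000) {
        cam.save("webcam-" + nf(counter, 5) + ".jpg");
        counter++;
        lastShot = millis();
      }
    }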

Framework for Describing Emotions
In order to achieve my goal of objective subjectivity, it was important to leverage a framework that enables me to communicate about my emotions in a consistent manner. Rather than attempt to re-invent the wheel, I decided to look for an existing framework that I could leverage. For a while I was leaning towards using Robert Plutchik’s wheel of emotion. However, I ended up deciding to use the HUMAINE project’s Emotion Annotation and Representation Language (EARL), because I find its structure aligns better with my data collection approach and the way I think about emotions. Interestingly, this framework was developed to support emotion-oriented computing.

Interesting Questions
During this initial process of data collection, several questions have been percolating in my mind. These questions are not directly related to my current project but are definitely worth further thought and investigation – who knows, one of them may serve as a seed for my thesis project.

  • How does the process of categorizing (or rationalizing) our body’s physiological responses into emotion categories impact our experience of life and the world around us? In some ways this is a process of mediation. How does this process affect our relationship with our own mind and body?
  • How can we experience the world more directly, without the interference of this mediation process? Is it desirable to seek this more direct experience of the world?
  • How does this type of research affect how we experience our emotions? Since emotions are constructs that are largely determined by the stories we create to rationalize the physiological responses of our body, how does this focus on emotions impact this process (for better or worse)?

Excited, Emotional and Moody

Tuesday, October 26th, 2010

Since the beginning of the semester I’ve had a one-track mind in regard to the Rest of You class. I’ve wanted to find a way to track my level of excitement, my emotions, and my moods so that I could gain insights about what makes me happy (or better yet, how I can lead a fulfilling life). This direction stems from my long-term interest in understanding myself better, and specifically in understanding my relationship to the world around me. I strongly believe that I am responsible for my emotions; this is not to say that I can control the way my body responds to the world.

Emotions encompass more than just the physiological responses of my body; they also entail the conscious processes that classify those responses into emotion categories. I strongly believe that I have the power to be the conscious writer of the stories that my brain creates – the stories that define my conscious experience of the world and my emotional response to these experiences.

With all of this in mind, to succeed in my project I felt it was important to find ways to capture three different types of data: 1) information about my physiological level of excitement; 2) contextual data about what I am up to throughout this research (e.g. my activities and locations); and 3) personal thoughts and reactions related to what’s going on in my life. From conversations with others during the last couple of days, I am also considering adding a fourth data dimension to this project: capturing input from people who interact with me regarding my current mood.

So here is how I plan to capture all of this data:

  1. Physiological data: I’ve built a small portable biometric sensing device (pictured above) that will capture my heart rate and galvanic skin response. This little device will travel with me 10–12 hours a day. Whenever my vital signs go above a pre-defined threshold, the light on the device will flash, prompting me to categorize my excitement as either a positive or negative experience.
  2. Contextual data: I will use two separate processes to capture contextual data. First, the portable device that I have created will feature a GPS component that will enable tracking of my location and time of day. Second, every night I will update a calendar with detailed information about my activities for that day.
  3. Personal perspectives: to capture my feelings, emotions and moods I will use two main processes. First, the buttons on the portable biometric device will capture high-level positive/negative data regarding my emotions. The more informative means of capturing data will be via Twitter. Every time I make a positive/negative input on the biometric device, I will also send a tweet with a brief note regarding my current mood.
  4. Perspectives from others: to capture data from others regarding my emotional states and moods, I will request that people send tweets to a specific hashtag with their input (or I may set up a text-message inbox – if you have any suggestions for a good service where I can set this up, please let me know).

One last potential data set that is ripe for analysis is my emails and blog posts. I have not yet decided whether or not I will add this lens to the project. The good thing about this data is that I don’t have to proactively capture it; I can decide to analyze it on the back end.

As I capture data over the next month, I will work to develop visualizations that help me gain insights from all of this information. There are a couple of perspectives I know I want to explore: 1) my mood based on activity type (particular classes at ITP, time spent with specific people, time spent doing yoga, time eating or drinking – better yet, drunk); 2) my mood based on time of day or week (when am I most happy?); 3) my mood graphed geographically (what places excite me?); 4) my perspective versus the perspective of those around me. (A stripped-down sketch of the geo mood map idea appears at the end of this post.)

Here are some relevant links to pictures and videos related to this project:

1) geo-based visualization prototype of my GSR readings during a 2-hour bike ride
2) treemap visualization based on key-logging done over an entire week
3) pictures of the prototypes of my portable bio-metric device
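
At its core, the geo-based visualization linked above just maps latitude/longitude pairs into screen space and colors each point by its GSR reading. Here is a stripped-down Processing sketch of that idea; the file name, column names, and the 0–1023 sensor range are hypothetical:

    Table readings;

    void setup() {
      size(800, 600);
      colorMode(HSB, 360, 100, 100);
      readings = loadTable("gsr-ride.csv", "header");   // hypothetical log file

      // find the geographic bounding box of the ride
      float minLat = Float.MAX_VALUE, maxLat = -Float.MAX_VALUE;
      float minLon = Float.MAX_VALUE, maxLon = -Float.MAX_VALUE;
      for (TableRow row : readings.rows()) {
        minLat = min(minLat, row.getFloat("lat"));
        maxLat = max(maxLat, row.getFloat("lat"));
        minLon = min(minLon, row.getFloat("lon"));
        maxLon = max(maxLon, row.getFloat("lon"));
      }

      // plot each reading, colored from blue (calm) to red (excited)
      background(0, 0, 100);
      noStroke();
      for (TableRow row : readings.rows()) {
        float x = map(row.getFloat("lon"), minLon, maxLon, 20, width - 20);
        float y = map(row.getFloat("lat"), maxLat, minLat, 20, height - 20);
        float hue = map(row.getFloat("gsr"), 0, 1023, 240, 0);
        fill(hue, 80, 90);
        ellipse(x, y, 6, 6);
      }
    }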


Geo Mood Map: Data Visualization Prototype

Saturday, October 23rd, 2010

I have long been interested in the idea of doing a mood mapping project for personal understanding and growth, quality-of-life improvements, and a search for fulfillment. These are lofty goals and big expectations, I know. I am aware that my project probably won’t amount to much in these areas, but at the very least I expect to be able to glean some simple insights regarding what makes me happy.

Since the beginning of this semester my focus at ITP has shifted towards the exploration of personal wellness. For a long time I have been interested in helping myself and others create possibilities and fulfillment in their lives. Since last year many seeds have been planted that are now enabling me to make this shift. Here is a brief overview of the various “fertilizers” that have helped create this budding possibility:

  • Over the last year I have had a personal focus on my physical and mental health. Health was actually my theme for the year. As part of this initiative I am practicing yoga and meditation. I have also overcome an addiction that was acting as an impediment to my personal growth.
  • During this time I have also developed a new relationship with food. This was largely driven by the need to start cooking to save money, but has now blossomed into my involvement in local food communities such as my neighborhood CSA.
  • During the past year I’ve also had the opportunity to listen to and participate in many inspiring conversations. Two of the many people who have contributed to my life in this area are my mom (aka my yogi) and Linda Stone (who has some very interesting notions about post-productivity computing – check it out here).

One thing that has become clear to me over the past several years is that life cannot be reduced to language. I know this sounds like an obvious claim. However, living in our mind-obsessed society, I have noticed that I have become accustomed to focusing (better yet, limiting) my experience of the world around me to what can be grasped by my mind and described in language. I know that my mind does not have the bandwidth to enable me to experience the true richness of the world all by itself. The models of the world that reside in my mind, and the language that acts as a filter on my experience, arbitrarily cut up the real world into small dead symbols that lack the beauty of the reality to which they point.

Ok, now I am getting preachy and rather esoteric. To bring it back down to earth, I just want to say that it is only through our body and its intelligence that we can more fully experience life. So now bringing it back to my current pursuits, I want to work on developing technologies that are designed for human beings, not human minds.

I’ll keep today’s journal entry focused on the big picture. I’ll share more about the specifics of how I plan to achieve this in the next couple of days. In the meantime, you can download the code for this patch from github. I’ve already created a few prototypes so progress is happening at a steady clip.