For my final project, I was (am still?) keen on creating a mobile app that will work towards minimizing waste. My big plan for Foodprint is still a work in progress, and an app would only be the first step. I could work on developing learning/recognition algorithms for buying/selecting behavior, and it would hopefully be a very valuable and productive experience.
But we’re at ITP, and the ‘I’ stands for Interactive. Not that mobile apps aren’t interactive, but building one is closer to Mobile Development 101 than to why I’m here: to learn the basics of certain technologies (in this case, making apps), and then screw around with them and try something new.
Since we’ve spent the last few weeks working with sensors outside of the phone, I wanted to find a way to make them work with the phone in a new way. There are so many options, but why use these external sensors when our phone contains many of them within it?
These are questions I’m still thinking about, but I know I want to use a sensor and make it work with the app. Since we’re using phones, it’s got to be portable; otherwise, why is it on a phone? The presentation last night on Google Cardboard and augmented/virtual realities got me thinking about that as well, and if I could combine them all, I could have an interesting experience (assuming it’s an interesting idea).
I looked around for existing apps for Google Cardboard, and was surprised that, despite the hole for the camera, very few actually use it. The image above is from Glitcher VR, one of the few that does, and it’s a starting point for what I want to achieve. It just uses the camera and Google Cardboard, and you can view your surroundings in different ways (like 8mm film, like a VCR, with inverted colors, with face tracking). The thing is, at this point, there’s nothing beyond that. But what if we made some kind of game out of it? Could two people be in this augmented, effect-filled world? Could a TI SensorTag in their hands become a tool/weapon of some kind? Should they work together, or against each other?
I can’t say I have answers to these questions, but I know the steps I’d like to achieve, and hopefully, as creativity strikes, a direction for this game/application will become clearer:
- Make an app for Google Cardboard that uses the camera on the phone as the user’s new set of eyes.
- Use p5.js inside PhoneGap to make the viewing surface a canvas that can be drawn on. Apply effects to said canvas, and try to make animations within it.
- Utilize a TI SensorTag as a hand tool. Its sensors, in conjunction with the ones on the phone, can determine whether the two are oriented in the same direction. Make the button do something.
- See where this process leads.
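The p5.js step could start as something like the minimal sketch below. It draws the camera feed onto the canvas each frame and runs a color-inversion effect over the pixels (in the spirit of Glitcher VR’s filters). Assumptions: p5.js is loaded in the PhoneGap webview, and `invertPixels` is my own hypothetical helper, not a p5 built-in.

```javascript
// Pure pixel effect: invert the RGB channels, leave alpha intact.
// Works on any flat RGBA array shaped like p5's pixels[].
function invertPixels(pixels) {
  for (let i = 0; i < pixels.length; i += 4) {
    pixels[i] = 255 - pixels[i];         // R
    pixels[i + 1] = 255 - pixels[i + 1]; // G
    pixels[i + 2] = 255 - pixels[i + 2]; // B
    // pixels[i + 3] (alpha) untouched
  }
  return pixels;
}

// p5.js glue (browser-only, so commented out here):
// draw the live camera feed, then invert it in place.
// let cam;
// function setup() {
//   createCanvas(windowWidth, windowHeight);
//   cam = createCapture(VIDEO); // requests the phone camera
//   cam.hide();                 // hide the raw <video> element
// }
// function draw() {
//   image(cam, 0, 0, width, height);
//   loadPixels();
//   invertPixels(pixels);
//   updatePixels();
// }
```

A proper Cardboard view would eventually need the feed drawn twice, once per eye; this sketch keeps it to a single view just to prove the camera-as-canvas idea.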
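For the SensorTag step, the “are they pointed the same way?” check could boil down to comparing gravity vectors from the two accelerometers. This is a rough sketch under assumptions: the function names are mine, the readings are plain `[x, y, z]` arrays, and the BLE plumbing that actually delivers the SensorTag data is omitted.

```javascript
// Scale a 3-vector to unit length so magnitude doesn't matter.
function normalize(v) {
  const mag = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
  return v.map((c) => c / mag);
}

// Cosine of the angle between the two gravity vectors:
// 1 = same direction, 0 = perpendicular, -1 = opposite.
function alignment(phoneAccel, tagAccel) {
  const a = normalize(phoneAccel);
  const b = normalize(tagAccel);
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Treat anything within roughly 25 degrees as "pointed the same way".
function sameDirection(phoneAccel, tagAccel, threshold = 0.9) {
  return alignment(phoneAccel, tagAccel) >= threshold;
}
```

One caveat I’d expect to hit: gravity alone can’t distinguish headings (two devices both held flat “agree” no matter which way they face), so the compass/magnetometer would probably need to join the comparison for a real aim-at-your-opponent mechanic.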
I’m not a big fan of ‘shoot ‘em ups’, though I friggin’ loved laser tag as a kid, but this would be one obvious path. To stay away from the theme of guns, I thought of making it a silly string match, where your TI SensorTag is your bottle of silly string and you have to cover the other person in more of it (sort of like shooting… just more fun!). If I find a more functional or better reason for these things, I am happy to switch that aspect, but I want to get sensors talking together in a new, fun, and interactive way, and I feel like this is a good starting point.
Fun new augmented reality apps are starting to come out, but I think the next step for some of them (e.g., Google Skymap and other star-viewing apps) would be a much more immersive experience where we don’t have to so obviously look through our phones. Cardboard is still technically looking through our phone, but it almost literally becomes our eyes.