A quick Google search yielded a schematic designed by KK6GIP. A good place to start.
A dig around in the junk box yielded the necessary components: a piece of recycled project board, an old busted iPhone headset, and the cheap headset that came with the radio. After a few minutes of wire trimming and soldering, voilà: an iPhone-to-BaoFeng audio interface.
I tuned the radio to 144.390 MHz (a common 2m APRS frequency) and transmitted an APRS data burst. My coordinates and message were received by K2PUT, a local digipeater, which forwarded them to an internet gateway.
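For the curious, an APRS position report is just a short text payload. Here is a minimal sketch of how one is assembled — the callsign, path, and coordinates below are placeholders, and the real packet framing (AX.25) is handled by the TNC software, not by code like this:

```ruby
# Convert decimal degrees to APRS ddmm.mmH latitude format.
def aprs_lat(deg)
  hemi = deg >= 0 ? "N" : "S"
  d = deg.abs
  format("%02d%05.2f%s", d.to_i, (d - d.to_i) * 60, hemi)
end

# Convert decimal degrees to APRS dddmm.mmH longitude format.
def aprs_lon(deg)
  hemi = deg >= 0 ? "E" : "W"
  d = deg.abs
  format("%03d%05.2f%s", d.to_i, (d - d.to_i) * 60, hemi)
end

# Build the text portion of a position report.
# '!' marks a position report without timestamp; '/' and '-'
# select the primary symbol table and the "house" symbol.
def aprs_position_packet(call, path, lat, lon, comment)
  "#{call}>APRS,#{path}:!#{aprs_lat(lat)}/#{aprs_lon(lon)}-#{comment}"
end

puts aprs_position_packet("N0CALL", "WIDE1-1,WIDE2-1", 41.50, -73.96, "Hello APRS")
# N0CALL>APRS,WIDE1-1,WIDE2-1:!4130.00N/07357.60W-Hello APRS
```

The `WIDE1-1,WIDE2-1` path is a typical terrestrial digipeater path; swapping it out is all it takes to aim a beacon at a different digipeater.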
Ok that was fun. What next?
When the International Space Station isn’t using its ham radio for voice, it operates as an FM packet radio digipeater on 145.825 MHz.
So I connected my homemade 2m yagi antenna, tuned to 145.825 MHz, and set my digipeater path to ‘VIA ARISS’.
Now all I had to do was wait for the ISS to pass overhead. The ISS completes around 15 orbits a day, so this didn’t take too long.
Using an iPad application called ProSatHD, I was able to determine down to the minute when and where the station would pass overhead.
I created a new Ruby plugin for plamoni’s “Siri Proxy”, a proxy server for Apple’s Siri assistant. The proxy allows the creation of custom plugins that can intercept recognized speech and perform virtually any scriptable function imaginable.
The “Siri Proxy” plugin I wrote handles interaction with a PHP script that runs on my web server. The PHP script, which I developed months ago for personal use, lets me send commands to my car, which has a Viper SmartStart module installed.
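The core of the plugin is just pattern matching on the recognized speech. The real thing subclasses `SiriProxy::Plugin` and uses its `listen_for` DSL, but the matching logic boils down to something like this self-contained sketch — the command names are from my list below, while the endpoint URL is a stand-in, not my actual script:

```ruby
require "net/http"
require "uri"

# Map recognized speech to a command token for the PHP script.
# Order matters: the "pop trunk" phrase is checked before the
# single-word commands.
VEHICLE_COMMANDS = [
  [/vehicle pop trunk/i,                     ->(_m) { "trunk" }],
  [/vehicle (arm|disarm|start|stop|panic)/i, ->(m)  { m[1].downcase }],
].freeze

def match_vehicle_command(speech)
  VEHICLE_COMMANDS.each do |pattern, extract|
    m = pattern.match(speech)
    return extract.call(m) if m
  end
  nil
end

# The matched token would then be forwarded to the server-side
# script, e.g. (hypothetical URL):
#   Net::HTTP.get(URI("https://example.com/smartstart.php?cmd=#{cmd}"))

puts match_vehicle_command("Vehicle Start")      # => "start"
puts match_vehicle_command("Vehicle Pop Trunk")  # => "trunk"
```

In the actual plugin, each match also triggers a spoken confirmation back through Siri before the HTTP request fires.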
Currently accepted commands are: “Vehicle Arm”, “Vehicle Disarm”, “Vehicle Start”, “Vehicle Stop”, “Vehicle Pop Trunk”, and “Vehicle Panic”.
UPDATE: It now also responds to more conversational commands such as “Start my car”, “Lock my car”, “Pop my trunk”, etc.
I may change the command wording a bit later. This was just a proof of concept. We’ll see.
Siri Proxy & DNSMasq box – Ubuntu 11.04 Server VM
Quest Visual released its new iPhone app, Word Lens, this morning.
I saw a post on Google Reader this morning and decided to download it. After playing around and watching it perform the word-reverse demo, I purchased the in-app English–Spanish translation packs. That is when the fun began.
I was completely blown away by how quickly and precisely it could replace words on signs. The words looked like they belonged there in real life. It is definitely an extraordinary accomplishment.
I cannot wait for more language packs to be released.
Can you imagine a heads-up display, built into your sunglasses, automatically translating and augmenting your reality as you travel through a foreign country? A truly mind-blowing concept.
Wow… So a prototype of the next iPhone was lost (or stolen?) and sold to Gizmodo for $5,000. I really feel bad for the poor guy who lost it. Is this the final prototype? Who knows. Apple confirmed the prototype’s authenticity when it called and then sent a formal request letter to Gizmodo’s editorial director, Brian Lam, asking for its return. I’m eager to see what legal action, if any, Apple pursues against the finder (or thief?) and the buyer, Gizmodo.