The Iron Girl Project - Computerized Exoskeleton System

Lexikitty

Active Member
Howdy all!

So I've been lurking here and there on the RPF since Novemberish, and I'm finally getting around to posting something. This is my first thread here, so I'm sorry if I make any newbie mistakes. Allow me to introduce....

The Iron Girl Project

The goal behind this project is to bring as many of the technical systems of the Iron Man exoskeletons into real life. Simple. Sort of.

The idea is that a centralized computer (or set of computers) can pull data, issue commands to various mechanical systems, and give you real-time info on your surroundings. For instance, I made a handheld repulsor module recently using a Trinket (tiny development board) and used I2C to report back to a Raspberry Pi that it had been fired.


I originally used a poll/response system here, but I ended up switching to a "push" model because polling added an audio delay. Either way, this lets the computer have conversations with all of the modules on the suit. The repulsor, for instance:


  1. RPi registers button press on hand controller
  2. RPi sends command to hand to run "hand flash" program.
  3. RPi plays repulsor sound via Python.
  4. Trinket reports back to RPi and says "Hi, I fired" (plus whatever other information it can provide).
  5. RPi takes any relevant data and puts it up on the HUD.
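The five steps above might look something like this in Python. To be clear, the register numbers, the Trinket's bus address, and the `FakeBus` stand-in are all invented for illustration - a real build would talk to smbus2's `SMBus(1)` instead:

```python
# Sketch of the repulsor "push" conversation, RPi side. Addresses and
# register numbers are hypothetical; FakeBus stands in for smbus2.SMBus
# so the flow can run without hardware.

REPULSOR_ADDR = 0x12       # hypothetical Trinket address on the I2C bus
REG_COMMAND = 0x01         # hypothetical "run program" register
REG_STATUS = 0x02          # hypothetical status register
CMD_HAND_FLASH = 0x10      # hypothetical "hand flash" program id

class FakeBus:
    """Stand-in for an SMBus object, for testing without a suit attached."""
    def __init__(self):
        self.fired = False
    def write_byte_data(self, addr, reg, value):
        if addr == REPULSOR_ADDR and reg == REG_COMMAND and value == CMD_HAND_FLASH:
            self.fired = True
    def read_byte_data(self, addr, reg):
        return 1 if (self.fired and reg == REG_STATUS) else 0

def fire_repulsor(bus, play_sound, update_hud):
    # 1-2. button press registered; tell the Trinket to run "hand flash"
    bus.write_byte_data(REPULSOR_ADDR, REG_COMMAND, CMD_HAND_FLASH)
    # 3. play the repulsor sound immediately (don't block on the bus)
    play_sound("repulsor_fire")
    # 4. Trinket reports back that it fired
    fired = bus.read_byte_data(REPULSOR_ADDR, REG_STATUS) == 1
    # 5. put the result on the HUD
    update_hud({"repulsor": "FIRED" if fired else "FAULT"})
    return fired
```

The point of the structure is that the sound plays before the status read, which is the whole reason for moving from polling to push.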

It sounds (and is) kind of overkill. But where this gets really interesting is what data you can pull. For instance, laying wires into the armor plates along the arm would let you sense a break, dent, or puncture in any piece of armor (given an I2C-to-capacitive-sensing board) and show it in red on a suit model on the HUD. Rotation of the head can be translated into shoulder-mounted turret movement. If an arm missile assembly is used on the forearm, analog feedback servos could tell the system, "The arm door is supposed to be at 90 degrees, but it's stuck at 15, so I'm disabling it for now."
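That analog-feedback door check is simple enough to sketch. The tolerance value and function names here are my own assumptions, not anything from the actual suit code:

```python
# Hedged sketch of the feedback-servo check described above: if a door
# servo's reported angle is too far from where it was commanded, flag
# the assembly as disabled. The 10-degree tolerance is an assumption.

DOOR_TOLERANCE_DEG = 10.0

def check_door(commanded_deg, reported_deg, tolerance=DOOR_TOLERANCE_DEG):
    """Return (ok, message) for a servo-driven armor door."""
    error = abs(commanded_deg - reported_deg)
    if error <= tolerance:
        return True, "door OK at %.0f deg" % reported_deg
    return False, ("door commanded to %.0f deg but stuck at %.0f deg; "
                   "disabling" % (commanded_deg, reported_deg))
```

So `check_door(90, 15)` reports the stuck-door case from the example, while `check_door(90, 88)` passes.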

As far as the HUD goes, I'm using the eyepieces of an Olympus FMD-700. I'm toying around with two different methods of creating the overlays: one with Python, and the other with Processing. Here are some shots of them below, and a video.

Olympus HMD w/Python overlay:
hud1.jpg
hud3.jpg

The Olympus unit I'm using (FMD-700), along with the RPi Camera Board:
hud2.jpg

And the second method, using Processing and a somewhat spiffier 3D overlay:
2014-05-25+12.37.56.jpg

And a video! Because videos.


So that's the basics. What I'd really like some input on are the following:
  • What other systems should be implemented? As it stands, I've got AHRS (Attitude and Heading Reference System), temperature, repulsor systems, 16 channels of servo control, a shoulder turret (possibly) along with its control, and visor up/down.
  • What other types of data should be collected? I'm still sifting through stills of the IM movies for ideas, but I'd love to hear feedback.
  • If you were to use something like this in conjunction with armor/a suit, what would you want it to do?

I'm planning on releasing all code, diagrams, and schematics for everything I finish, once I've polished it up, as open source. I plan (hopefully) to keep it as modular as possible, so it can be expanded or simplified to fit other mechanized armor systems (Pacific Rim and Transformers cosplays come to mind). The software and hardware won't be Iron Man-specific; that's just what I'm using as a chassis to keep my energy and motivation for this high.

Oh, and if you have some experience in Python or Processing and you want to help out, I'm totally up for that. I managed to cram this project into my life using dark matter and an air compressor, so any extra help would be very much appreciated. I'll keep all future updates in this thread, so there's just one running ticker of everything related to it.

Cheers!

~Lexikitty
 
Last edited by a moderator:

Stirex

New Member
Really awesome - I was just thinking up something like this the other night using Google Glass... it's actually funny how as soon as I start to think something through, I find it on this board :) Guess great minds think alike. But I don't have the funding or the programming knowledge, lol.

Some ideas:

Head tracking/targeting like the Apache helicopter for a War Machine-style gun... also include a red border area for the gun's limits of travel.

Integrate a gyro for servos that activate flaps/control surfaces. As the user moves, the flaps move to "counter" the movements. Could have a "flight mode".

Temp sensors in the head and torso area that can control fans for the user.

Voice control or micro switches to activate other servos.

A rundown of all systems that can be displayed in the HUD... basically all attached components and their status, current operation mode, battery power, etc.
 

Lexikitty

Active Member
Stirex said: "Really awesome, was just thinking something up like this the other night using Google Glass..." (full post quoted above)

I actually was really excited when Glass was announced, and was going to be part of the early developer program for it. And then I sifted through the SDK and what it could actually do and was very disappointed with it. It is, at best, an Android notification shade with a camera nailed to it.

I have a 10-DOF (degrees of freedom) I2C IMU right now attached to the camera, which will probably be mounted above the eyes of the visor in the middle of the forehead. This'll measure temperature, barometric pressure/altitude, 3 axes of gyro, and 3 axes of acceleration, along with a magnetometer (compass/heading). (You can actually see this information in the HUD video.) I'll probably add another one in the body at some point, to move the flaps on the body in the direction opposite the heading change. The gyro on the head would probably be able to assist the turret, but probably wouldn't give enough orientation data to control more than that (safely).
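For reference, the two derived readings that IMU gives you can be sketched with textbook formulas. The heading convention and sea-level constant below are the usual ones, not taken from any particular sensor library, and the heading version ignores tilt compensation (fine while the head is roughly level):

```python
import math

# Rough conversions for 10-DOF IMU data: heading from the magnetometer
# (no tilt compensation) and pressure altitude from the international
# barometric formula. Constants are standard textbook values.

def heading_deg(mag_x, mag_y):
    """Compass heading in degrees, 0 = north, 90 = east (one common convention)."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

SEA_LEVEL_PA = 101325.0  # standard sea-level pressure

def altitude_m(pressure_pa, sea_level_pa=SEA_LEVEL_PA):
    """Pressure altitude in meters from the international barometric formula."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))
```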

I really like the red border for turret limitations - didn't think of that. Kind of like the War Machine HUD from IM2 when Rhodes is chasing Stark.

The "menu" system will be mostly (hopefully) controlled by the jaw. There'll be a button in the inside of the chin, one to the left, and one to the right. This should allow one to move between "flight" and "attack" modes, and at least have an "OK/Menu" button. Might actually 3D print a prototype jaw just for that this weekend.
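The three-button jaw menu described above is essentially a tiny state machine. A minimal sketch, assuming the mode names from this thread and a left/right/OK button layout (the wiring itself is not modeled):

```python
# Sketch of a three-button jaw menu: left/right cycle the highlighted
# mode, the "OK" chin button confirms it. Mode names follow the thread;
# everything else here is an assumption.

MODES = ["general", "flight", "attack"]

class JawMenu:
    def __init__(self):
        self.index = 0           # currently highlighted mode
        self.active = MODES[0]   # currently confirmed mode
    def press(self, button):
        if button == "left":
            self.index = (self.index - 1) % len(MODES)
        elif button == "right":
            self.index = (self.index + 1) % len(MODES)
        elif button == "ok":
            self.active = MODES[self.index]
        return self.active
```

Nothing switches until "OK" is bumped, so a stray chin tap on left or right can't accidentally drop you into attack mode.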

A run down of all systems is what I'm still working on. Battery level, damage, orientation data, but....I don't know what else to put up there. The HUDs from the movies actually have a lot of extraneous, pretty models flipping about and being showy, but very little of it is actual data (At least from the IM1 stills I've got).

Cheers,
~Lexikitty
 

Stirex

New Member
OK, when I was in the Army I was a scout and a Bradley commander. I know the turret system was integrated with the GPS system, which included the compass reading. There were two sensors, one in the hull and one in the turret, which let the vehicle know which way it was pointed relative to the turret. With the GPS and laser rangefinder I could lase a target and it would return a 10-digit grid location based on that data.

If the sensor in the helmet were sensitive enough, it could detect the difference between the head heading and the body heading, allowing the turret to move relative to the head only. I know the accelerometers and gyros in my phone are sensitive enough that I can fly a quadcopter with it, so I would think this would allow some nice targeting.

I agree the HUD in the movie is cluttered with nonsense and could be minimized. In normal operation you could show just a scrolling heading at the top of the view, with an arrow indicating the body heading or direction of travel, and the head as the main compass. Check out the HUD in MechWarrior Online - it shows how legs and torso can be marked. Otherwise, a generic human outline with points on the different limbs, chest, head, front, and back could be used to show the systems.
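That head-minus-body turret idea reduces to a few lines of angle math. A sketch, with a made-up travel limit (the real limit depends on the turret mount), where the `at_limit` flag is what would drive the red HUD border:

```python
# Sketch of head-minus-body turret slewing: the turret tracks the
# difference between helmet heading and chest heading, clamped to its
# travel limits. The 60-degree limit is an illustrative assumption.

TURRET_LIMIT_DEG = 60.0

def wrap180(angle):
    """Wrap any angle into the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

def turret_command(head_heading, body_heading, limit=TURRET_LIMIT_DEG):
    """Return (turret_angle, at_limit); at_limit can drive the red border."""
    rel = wrap180(head_heading - body_heading)
    clamped = max(-limit, min(limit, rel))
    return clamped, clamped != rel
```

The `wrap180` step matters: head at 350 degrees with the body at 10 degrees is a 20-degree swing left, not a 340-degree spin right.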

As I would have done it I would have a few modes that could be operated:
1. System check/ startup
a. Would show systems check
b. Simulated power up with bar graphs etc.
c. Software, axis, gyro calibration

2. Operation
a. General HUD
b. Headings
c. Battery
d. Minimized Body Outline
e. Power level (bar graph)
f. HUD would still show artificial horizon lines and a target box, but they would be grayed out.

3. Combat
a. Weapon Systems with Ammo
b. Larger Body Diagram (show damage or something)
c. HUD targeting in green with crosshairs and a weapons-travel limitation box (look at modern fighter jets' gun targeting :)) (add a slight drift and delay when body and head are moving independently, to give the illusion of speed)
d. Counter measures display
e. Fake radar screen

4. Flight / Flight Combat
a. As above but include altimeter, airspeed, and activate flap/ control surface systems

Anyways, this all would make for a lot of fun... especially if you are stuck at a convention with nothing better to do than target people to squish them :)

Also, integrate a camera that sees what the user is looking at, and possibly have an output to entertain people :)
 

Stirex

New Member
Also, just thinking - if you are worried about limitations on servos, I have some in my RC cars that have 400+ oz-in of torque and a transit time of 0.09 seconds... others I have get 200+ oz-in of torque and 0.04-0.05 second transit times :) operating at 8.4 V on LiPo.
 

Lexikitty

Active Member
That's exactly what I was looking for! Thanks! The magnetometer I'm using is definitely sensitive enough - a little too sensitive, if anything. I'm going to sketch up the four different HUDs this weekend. Targeting might be a job for later on, as that'll involve some real horsepower with OpenCV (unless Processing can do more than I think it can), and loading up the turret with Nerf darts might be fun once I've got the trajectories down. But I can at least create the three operational modes and boot-up sequence.

Depending on which platform this ends up running on (either a Windows tablet or a Raspberry Pi), there'll be a little screen or an actual tablet on the back of my suit so folks can see all the madness going on inside the helmet. :)

Oh, and what model/brand of servos are you using for those higher-power jobs? I've been using micro metal-gear ones that do just fine for panels so far, but I'd like to have options for the larger flaps on the back and the weight of the turret. Currently my I2C servo controller only supports up to 6V, but I could make dedicated lines for just the turret/flaps.

Thanks again!
~Lexikitty
 

Stirex

New Member
ProTek R/C 130SS Standard Digital "Super Speed" Metal Gear Servo (High Voltage) [PTK-130SS] | Radios & Accessories - A Main Hobbies

ProTek R/C 150S Standard Digital "High Speed" Metal Gear Servo (High Voltage/Metal Case) [PTK-150S] | Radios & Accessories - A Main Hobbies

ProTek R/C 150T Standard Digital "High Torque" Metal Gear Servo (High Voltage/Metal Case) [PTK-150T] | Radios & Accessories - A Main Hobbies

Hitec HS-7966HB Digital High Speed Karbonite Gear Servo [HRC37966S] | Radios & Accessories - A Main Hobbies

Just some examples... there are several more that you can poke through till you find something that meets your needs

Savox SV-0235MG "Super Speed" Steel Gear Digital 1/5 Scale Servo (High Voltage) [SAV-SV-0235MG] | Radios & Accessories - A Main Hobbies (much larger than standard size, but could control the gun mount)

These are the ones I use:

HFD (Hobby Force Distribution) - red for torque, green for general purpose (throttle/brake), or the Signature series for both.

I can say these have the highest resolution I've seen, and they are whisper-quiet.

If you have any questions about servos, let me know. I have run most of them and can tell you which are worth your time and which ones aren't... some are noisier than others, some are more durable... I've put quite a few through the wringer, as I race 1/8 nitro/electric off-road.
 

Lexikitty

Active Member
Perfect, thanks. Torque will probably be the only real priority - control-flap speed won't be a real issue unless I'm actually flying the thing. I should be okay as long as I can find an I2C controller for the higher-voltage servos, or I can just make a pigtail for the turret power and keep the signal wire on my existing 6V controller.

~LK
 

Murdoch

Sr Member
RPF PREMIUM MEMBER
Just a thought. You mentioned earlier about control via jaw movement. Have you seen the sensors that are like medical electrodes that react to your movement? I've seen this with Arduino; maybe it's available for the RPi... GM

Also, which armor are you looking to apply this to? GM
 

Lexikitty

Active Member
Murdoch said: "Have you seen the sensors that are like medical eletrodes that react to your movement? ... Also which armor are you looking to apply this to?" (quoted above)

Hm. Not sure what sensor you're referring to. I know of the "tongue mouse" contraption for people with mobility problems, and some Googling just now found me a device for TMJ analysis and a jaw-tracking/facial-capture rig, but the former is a bit too large, and the latter requires a camera aimed at the face. Do you know any other details about it? Anything that works on Arduino I can get working on an RPi, more or less. I was planning on using three arcade-type lever switches mounted to the inside chin of the helmet, and just "bumping" them with my jaw/chin. Not very graceful, but theoretically simple. An additional controller on the side of the right-hand palm controls weapons systems.

I'm planning to use this for an Iron Man-esque suit of my own, which I haven't picked out a name for (thinking of DIESEL, but I can't quite cram a useful acronym into that - yet). I only have the baseline sketches for the arm and helmet done. However, I'm looking to make this a system for anybody who wants to build a working exoskeleton, so I'm trying to walk a somewhat compromised line for the sake of modularity - and so other people can plug it into their own projects. In a perfect world, I would write a system and provide the necessary hardware specs for an operating system of sorts, and folks could add their own skins for whatever cosplay they wanted to integrate this into.

Just some basic ideas for my own personal suit:
IG1.png

Oh, and I came up with the following rough sketch of the new "general mode" HUD.

HUD_G.png

I ended up using two heading bars - one will show the chest orientation (bottom), and the other will show the head orientation (top, and main). That way I can use a very rough triangulation system if I ever want to. The "Range" bit in the bottom left-hand corner will show the rangefinder data from those calculations, but that's a long way off yet. For now, I'm going to pop this into Inkscape (after I go hunt down some tacos) and see which elements I can get knocked out this weekend for use in the HUD.
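A scrolling heading bar like the two above is just a mapping from compass degrees to pixel offsets. A sketch, where the screen width and the degrees-of-heading shown across the tape are illustrative assumptions, not measurements from the actual HMD:

```python
# Sketch of a scrolling heading tape: map a compass mark to a horizontal
# pixel position, with the current heading centered. Width and field of
# view are illustrative assumptions.

TAPE_WIDTH_PX = 720    # hypothetical display width
TAPE_FOV_DEG = 90.0    # degrees of heading shown across the full tape

def tape_x(mark_heading, current_heading, width=TAPE_WIDTH_PX, fov=TAPE_FOV_DEG):
    """Pixel x of a heading mark; the current heading sits at center screen."""
    delta = (mark_heading - current_heading + 180.0) % 360.0 - 180.0
    return width / 2.0 + delta * (width / fov)
```

The same function draws both bars - feed it the head heading for the top tape and the chest heading for the bottom one.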

~Lexikitty
 

mrjbarl1

Active Member
This is pretty much exactly what I got made fun of for - saying that I plan to equip the suit I'm building out of steel with this same stuff (give or take a few personal ideas).

This is cool though - you should take a look at my thread and see if you can figure out how I did some of my little tricks.

Keep up the good work!


 

Murdoch

Sr Member
RPF PREMIUM MEMBER
Lexikitty said: "Hm. Not sure what sensor you're referring to..." (full post quoted above)
https://www.sparkfun.com/products/11776
This is what I plan on using in my build. I like seeing folks making their own hybrid armors. The creativity is amazing; looking forward to seeing more... GM
 

Lexikitty

Active Member
mrjbarl1 said: "This is pretty much exactly what I got made fun of for saying..." (quoted above)

Thanks! I just saw it - that's some really fantastic work you've done there. I actually remember coming across that thread a while back when I was just wandering around the RPF. I got some sheet metal recently to try to get the feel of it - I made a two-pen/notepad holder from a design I modeled and unfolded in SketchUp - and decided I simply don't have the room or equipment to play with metal quite yet. (I live in a fairly small apartment in NYC, so I don't have room to use things like grinders, and I hate the sound of files - it makes my teeth hurt, for some odd reason.)

1451352_10152078803931934_1123440507_n.jpg

Eventually, though....I'll have my fun. Once I have a garage.

Murdoch said: "This is what i plan on using in my build..." (the SparkFun sensor linked above)

Thanks kindly! That is an AWESOME find on the sensor front. Definitely useful for weapons systems. I'd previously been planning to just use two tilt sensors (2 bucks each) arranged in a configuration so that when the arm is at the desired angle, the weapons controls activate. It does require an ADC, but I'll definitely look at it and try to work it into the budget, if for nothing else but pure awesomesauce.

~Lexikitty
 

darkhawk

New Member
Many interesting ideas... how is the frame rate with the Raspberry Pi?
The issue I ran into was that the frame rate tanked as soon as you started trying to display a ton of information over top of the video.
Or it could be that I'm just not using it properly.

I do really enjoy the HUD though, and I considered it myself for the suit I'm building. I ended up using a Myvu Personal Media Viewer because they were cheap (~$50 on eBay), easily modifiable to fit inside the helmet, and still allowed the option to see through the eyes. I'll be using a Raspberry Pi (for now... this may change eventually once something more powerful is available), along with a number of Arduino boards.

I found a modified version of Adafruit's Python code for the point-and-shoot camera using a Raspberry Pi. I have modified that code a bit to add the ability to record video as well as take pictures, and to play a quick music playlist (my suit WILL play music....).

If you'd like to take a look, I keep all the Iron Man related posts I make public...you might have to wade through some other things, but most of it is Iron Man build related...
https://www.google.com/+AaronLunger

Again, I really love the HUD idea, but my programming experience in that realm is quite low yet (learning isn't a priority while building the suit yet....just getting it functioning is). I'd love to see the code at some point.

Also, as far as everything is concerned, my chest will have an Arduino Mega attached that has 10 buttons (one in each finger; you can see the gloves and buttons in my G+ post), and it controls pretty much everything - from the color of all the LEDs, to the opening/closing of the helmet, to activating the hand blaster(s) and chest blaster, and eventually even the gauntlet opening/closing and firing whatever weapon I desire in there.

I am also going to include a Wii Nunchuk for use as a mouse with the Raspberry Pi. I'm not sure if I'll be using the accelerometer or the actual joystick to control the mouse pointer, but that's what I'll use for it - at least until I consider a system that is more powerful (perhaps an embedded Core i3 PC board eventually?).
 

Lexikitty

Active Member
darkhawk said: "Many interesting ideas.....how is the frame rate with the Raspberry Pi?" (full post quoted above)

Oh, very cool. *squees with excitement* Very, very cool. See, I tinkered with the Adafruit camera code some too, but I found that the camera is best used in its native hardware state - that is, piped directly into the RPi framebuffer instead of sending any data in and/or out of the running Debian processes. Using the -op switch, you can create an overlay without delving too far into Python or OpenGL. Here's an example of it:


What I'm unashamedly doing is cheating: by setting the -op switch, you change the opacity of the video feed injected into the framebuffer. This lets you use black as a fake "key" (think chroma key, but with black), and as long as the lines in your HUD overlay are good and solid, they'll show through - even though you are technically pushing the HUD through the camera feed instead of laying it on top. Sneaky, but effective.
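A toy model of why the trick works, using an ordinary alpha-composite of grayscale values (the formula is the standard blend; the numbers are only for illustration and aren't taken from the Pi's actual compositor):

```python
# Toy model of the -op trick: the camera preview is blended over the
# framebuffer at a fixed opacity, so a bright HUD pixel still "punches
# through" dark video. Values are 0-255 grayscale; standard alpha blend.

def composite(video_px, hud_px, opacity):
    """Camera preview at `opacity` (0-255) blended over the HUD framebuffer."""
    a = opacity / 255.0
    return round(video_px * a + hud_px * (1.0 - a))
```

At half opacity, a full-white HUD line (255) under black video still renders at roughly half brightness, which is why solid lines survive the blend while faint ones wash out.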

I'm also having the internal RPi/PC debate. Currently I'm cheating with my massive desktop rig just so I can get on with development, but I'm worried about Processing (my weapon of choice for the more recent HUD video, seen in the first post) having any sort of decent framerate on the Pi. If it fails miserably I've got an Acer W500 tablet that should be able to run it, and I can probably salvage more horsepower from elsewhere if needed. Only problem with that is the loss of a direct-framebuffer feed, but at the win of, oh, I don't know, not ripping your hair out trying to get above 1.5 FPS. For now, development in Processing, since it handles 3D elements straight out of the tin and makes for an overall better visual language than Python.

And I'm currently only using one Arduino, more as an ADC than anything, via the serial interface. Everything else is routed over the I2C bus, which makes it much more fluid and expandable without having to worry about GPIO pins. Also, fewer wires running through an already constricted space. It's a bit more complicated to write for, though, especially when all you need is "just one sensor right there, dangit".

Using the Nunchuk is a novel idea. I've found the mouse to be kind of hard to use as a control device for utilitarian things like this, though. I'm trying to keep things to either GPIO bindings or keymappings, using a Trinket as a fake keyboard.

And all the code will be up on GitHub once I actually figure out how to properly use GitHub. And remove notes to myself like //this works but in a stupid way, fix it.

Oh, and I love your shop. It's simply gorgeous. Oh, when I finally have more than 10 feet square to store all my worldly belongings (including my shop)... that'll be a lovely, lovely day.

~Lexikitty
 

darkhawk

New Member
Yeah, I'm an electrical design engineer by trade, so most of the hardware side is pretty much just what I do, and easy for me.
The coding side, well - I only picked up Python about a year ago due to work, and it was much simpler than trying to use pygame or anything else.

The only reason I like using the Arduino is that the RPi can't handle some of the work I required in my suit. If you look at Adafruit, all the LEDs I use on my suit are NeoPixels, which are absolutely amazing for lighting anything. The RPi can't handle writing data to them, but the Arduino can just fine.
Now, I could use SPI or I2C to have the RPi write data to the Arduino to make it change color and do sequences, but I preferred to just let the Arduino do it all itself. It was simpler. The only interaction I'm going to have (for now) between the Arduino and RPi is to have the Arduino relay button pushes to the RPi for changing the music selection and starting/stopping playback. Outside of that, the RPi is mainly for display, as well as for taking pictures and recording video. That was my original intention and main hope for it. The HUD idea was just a 'well, if I can do this... maybe?' thought.

The 'shop' is actually the lab I work in at work. During my lunch breaks I usually work on the hardware side of the suit. I then work on the software side at home in my spare time.

As far as wires....I just gave up on not having wires everywhere. I considered wireless, but I didn't want the 'chance' of having anything hacked (really thinking like Stark would I suppose).

I really did consider the HUD and sensors you have slated though, but gave up due to a very restricted time frame.
The other issue I ran into was that for COSPLAY it really didn't have a use, other than a WOW effect if people could see it. So for now I'm opting for the low-tech approach, and once I get all the other external junk done and working very well, I'm going to upgrade it all.

I'll keep this bookmarked. I really love what you're doing.

If you don't mind me asking, what resolution is the HMD you're using? I really want something a bit better (640x480 would be awesome, and 720p would be fantastic, but at $600 I'll pass on that), but the biggest issues are just cost and size.
 

Lexikitty

Active Member
The HMD I'm using is the front half of an Olympus Eye-Trek FMD-700 with a resolution of 720x480. It actually came off of one of my older JORDY units (Joint Optical Reflective DisplaY) that I used to use to read sheet music and look at small things (I went to conservatory for violin performance [ended up in IT, lol], and I'm legally blind). The practical upshot of all this is that if you look for used Version 2 JORDY units on eBay, you can get an HMD with its own replaceable battery pack and visual controls for about 70-100 bucks if you time it right. It even has a built-in high-zoom camera - I haven't begun hacking the camera bit yet, but that would be super useful as well. The Version 1 JORDY sells for even cheaper, if you can find one, but it has a resolution of 640x480. Brighter screen, though.

I actually use Trinkets for the NeoPixel assemblies. Still self-contained, but good enough to run a NeoPixel ring, a 3W center LED with a MOSFET, and report back on the I2C lines. And you still have one analog read left over, because why not. That way I only have to run 4 wires down the arm for anything that needs arm control - SCL, SDA, VIN, and GND. Also, if the RPi freezes or dies, the repulsor will still be a self-sustaining system - something I'm trying to make every limb be, just in case.


And yea, I fell more on the "realism" side of the mental debate. I originally started with the cosplay idea, and then realized that if I was going to do this, it might as well be worth overdoing. I'll save the simple approach for all my other cosplays, for the sake of my sanity. And this project will probably take....well, on the upper side of months, possibly a year or two. I also don't have a ton of space (as I've mentioned), or tons of building experience (some, but not a ton), so working on the techy bits keeps me inspired. I did just get a small 3D printer though, so I'm at least making physical bits for the arms now. :)

Oh, and I'd advise against having Python parse your data from the Arduino over I2C or anything like that. I haven't tested USB serial, but over I2C, Python took almost a full second to register a button press, load a 2-second "repulsor fire" MP3, play it, and go back to polling the I2C bus. Doesn't sound like a lot, and might be fine for music, but sound effects linking up with NeoPixel effects got all flavors of wonky. Just my experience. *shrug*

~Lexikitty
 