
On the everlasting quest for entertaining content for my RGB LED matrix, i stumbled upon twitter. an endless stream of information that “just” needs to be parsed and turned into light. starting off with something simple, i decided to extract people’s favorite colors. i found a nice article on using the twitter search API in python and used it as a base for my script.


here’s what it does: first, it searches for the string “my favorite color”, which in practice turns up around 1-5 matching tweets a minute (some totally random samples were shown here)

my initial plan was to use the wikipedia list of colors as a lookup table, until i realized that people are commonly not very creative when choosing their fav color: it’s mostly one of about 10 different colors. so i compiled my own small lookup table from my experience of staring at tweets:

colorlist = ["grey","lime","white","violet","yellow"...

the script simply cleans up the tweet, removes strange characters like “#” and compares every word of the tweet against my lookup table. the first color string in the tweet wins.
after that, the string is converted into predefined RGB values and weighted by the tweet length to generate some more diversity: the longer the tweet, the brighter the color.
that’s basically it. now every time somebody tweets his/her favorite color, a pixel slowly fades into the LED matrix, forming a color spiral in the end.
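the whole pipeline is short enough to sketch in a few lines of python; the color table values and the length scaling below are illustrative stand-ins, not the original script’s exact numbers:

```python
import re

# illustrative RGB table -- the real script uses its own predefined values
COLOR_RGB = {
    "red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255),
    "yellow": (255, 255, 0), "purple": (128, 0, 128), "pink": (255, 105, 180),
    "orange": (255, 165, 0), "white": (255, 255, 255), "grey": (128, 128, 128),
    "black": (16, 16, 16),
}

def tweet_to_rgb(tweet, max_len=140):
    """first known color word wins; longer tweet -> brighter pixel."""
    words = re.sub(r"[^a-z ]", "", tweet.lower()).split()  # strip '#' and friends
    for word in words:
        if word in COLOR_RGB:
            scale = min(len(tweet), max_len) / max_len
            return tuple(int(c * scale) for c in COLOR_RGB[word])
    return None  # no known color in this tweet

print(tweet_to_rgb("my favorite color is #blue, obviously"))
```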

the result is a very relaxing effect that somehow connects my room to the internets out there in a very abstract way. there are color trends that change over the day and the stream of retweets forms longer traces in the same color. you get an idea of what’s going on in the “favorite color” corner of the twitterverse without obtaining too much information. obviously you could generate a similar effect by choosing random colors and times. but if you look at the display and you know that if a pixel lights up, somebody in some place in the world initiated this by tweeting his/her favorite color and is totally unaware of this project, it feels awesome.

so go ahead and check out the project files to use twitter as a data input for your next project:

))) project files (((



In the beginning of 2011, i was asked to create a light effect for electronic music parties. it had to be robust and simple, and the budget was just 200 Euros. my first thought was obviously an LED matrix, but as i had experienced in my former matrix projects, these things can be expensive. after a short brainstorming session, we came up with the following concept: we would build single panels that contain five RGB LEDs in a row. these panels can be mounted on the ceiling and either distributed around the room or combined to form a matrix. the design was kept very simple and therefore cheap, which allowed us to build a few panels within the budget and extend the project if more money became available.

time was short, so we went for five panels to end up with a 5*5 matrix at the first party. we bought slats, stapled them together and ended up with ladder-like constructs that were 3.3 m long and 0.4 m wide. some fabric was used as a diffusor. an ATmega168 and some transistors on a breadboard control the five LEDs and the thing looked like this:


as you can see, the angle of radiation of the LEDs separates the single ‘cells’, making it possible to display clear ‘pixel’ images later on. since a breadboard is not the most robust solution for electronics and manufacturing PCBs would have been too time-consuming/expensive, i used perfboard and through-hole components. i routed the board in CadSoft Eagle and used a CNC mill with a pen as a plotter to draw the traces onto the boards before soldering. this made it easy to reproduce them and to place the mounting holes. no more messing around with mirrored Eagle printouts and forgotten traces. optimized like that, soldering went pretty fast, about an hour for a complete board.

plotted pcbs

so what’s on that board? as the brain, i still use the ATmega168. ULN2001A darlington arrays drive the LEDs and a 1489N RS232 receiver converts the ±12V signal to 0-5V. voltage regulation is handled by a 7805, dirty but inexpensive. the complete board costs around 10 Euros. as connectors for the power supply, i chose 4-way MOLEX power connectors because they are cheap, robust and reverse polarity protected. for the data line, standard SUB-D9 connectors are used. the boards are designed to be daisy-chained, so power/data in on one side and power/data out on the other.

ajolicht pcbs

power is supplied by an old ATX PSU. i use the 12V line and regulate down to 5V on the single boards. this is done to overcome voltage drops when using a long power line and many boards.
the data stream that controls the panels is generated by an old laptop running some python scriptage on a linux system.

AjoLicht matrix half hanging

all panels share the same RS232 line. this is possible because they only receive. each panel has its own address and can thereby be controlled individually, so it does not matter whether you want to control one panel or fifty. the firmware on the panels handles the data stream, generates the PWM signals via binary code modulation and performs gamma correction.

the protocol for data transmission is rather simple:
'A',address,15 bytes of data (5xRGB)
represents a data package for one panel, 17 bytes in total.
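on the sender side, assembling such a packet takes only a few lines. this python sketch shows the idea (function name and demo values are illustrative, not from the actual control script):

```python
def panel_packet(address, pixels):
    """build one 17-byte packet: 'A', address, then 5x RGB."""
    assert len(pixels) == 5
    data = bytearray([ord('A'), address])
    for r, g, b in pixels:
        data.extend((r, g, b))
    return bytes(data)

# all five LEDs of panel 3 set to full red
pkt = panel_packet(3, [(255, 0, 0)] * 5)
print(len(pkt))  # 17 bytes, as described above
```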
here’s a video of the finished five-panel, 3*3 m matrix:


the panels have so far survived 3 parties, rough handling and being stored under bad conditions.
so yes, this is the billionth LED matrix, but this time it’s large scale and really cheap, easy to build and makes a cool illumination for parties. and it always surprises me how you can still amaze people with a bunch of blinking LEDs.

specs in short:
* panel is 3.30m x 0.4m
* costs for each panel: 25 Euros
* 5x superflux RGB LED per panel
* RS232-bus
* 24 bit color depth
* ~80 FPS @ 5 panels in a matrix
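the frame rate figure is easy to sanity-check. assuming 8N1 framing (10 bits per byte on the wire) and a standard 115200 baud line (the actual baud rate isn’t stated above, so that part is an assumption), the bus limit comes out comfortably above 80 FPS:

```python
PANELS = 5
BYTES_PER_PANEL = 17   # 'A' + address + 15 color bytes
BITS_PER_BYTE = 10     # 8 data bits + start + stop bit (8N1)
BAUD = 115200          # assumption -- not stated in the post

bits_per_frame = PANELS * BYTES_PER_PANEL * BITS_PER_BYTE  # 850 bits
max_fps = BAUD / bits_per_frame
print(round(max_fps, 1))  # ~135.5, so ~80 FPS leaves plenty of headroom
```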

feel free to download the source files including firmware sources, example python script and eagle files.

))) project files (((

if you have any questions, please do comment.

A few weeks ago, i was deep into Solid Edge 3D CAD modeling for some mechanical stuff. our CAD computers are equipped with 3DConnexion 3D mice. this 6-DOF input device makes navigating in CAD software a pleasure. you can rotate and translate the objects on the screen while using a normal mouse in your other hand to manipulate them. anyways, once you get used to this, you cannot CAD without it. seriously.
now at about the same time i had the pleasure to design some PCBs in CadSoft Eagle. my left hand kept moving the 3D mouse in order to pan and zoom the view, but of course nothing happened. unfortunately, Eagle does not support 3D mice.

yesterday i found some time to do a dirty little hack in order to use my space mouse with Eagle. it took me two hours and somehow works. i’ll spare you the details, just take a glimpse at the video.
this post is not about the actual hack. that’s just a proof of concept.

primarily i’d like to ask you, fellow space mouse and Eagle users, if you ever wished to navigate in eagle using your 3D mouse.
secondly i’ve got a message going out to the people at CadSoft: i like your software very much and it really improved my electronic design skills a lot. thanks indeed for that. but you’d be my all time heroes if you add support for 3D input devices to Eagle in the next release. come on, it’s not that hard. please?

for those of you who really want to know how i did this, here’s the story. first of all, 3DConnexion offers a lot of information, examples and support for people who want to use their input devices in their own applications, and of course for companies who want to integrate 3D mice into their software. besides a developer forum and their SDK download page, developers can download example code from the 3DConnexion FTP server (user:examples/pw:examples).

i used a .net example to write a small software that reads the 3D mouse. to interface CadSoft Eagle, i used the WINDOW (@); command bound to a hotkey. this Eagle command centers the view to the mouse cursor. when i move the 3D mouse, my readout software moves the cursor away from the view center (you can see that in the video) and triggers the hotkey. the more i push the mouse, the greater the distance between cursor and view center. this results in a higher panning speed.
zooming unfortunately cannot be done continuously, because it is triggered by hotkeys that zoom in or out by a certain amount when the z-axis of the mouse exceeds a threshold.
this is all very dirty, i feel bad about it, do not try this at home, kids. although it somehow works, i have not really tested whether it is usable when actually working in Eagle. one problem is that because i use the cursor for navigation, it cannot be used for manipulation at the same time. if you, for example, want to move a component, grab it with the move command and then use the 3D mouse for navigating, the component is moved too. another point is of course the bad interface between my readout software and Eagle. it is surprisingly fluid, but not comparable to built-in 3D mouse support.
talking about built-in support, are you aware that panning and zooming only uses 3 DOF of a 6-DOF input device? that’s 3 rotational DOF left over that could be used, for example, to rotate parts or for other awesome features.

While relaxing on a beach in spain back in 2006, an idea came into my mind: i wanted to build an LED display. full color and large. not large in terms of resolution, just large in terms of dimensions. in december 2006, i made actual plans for the project. from there on, i experimented, prototyped, programmed and soldered from time to time. having no deadline made it a real long-term project. but then in 2009, i had my 10×10 pixel RGB LED matrix, a square meter of color and light.

matrix with its creator

i guess most of you are particularly interested in some facts about the hardware and software. first, the hardware. i used 100 Superflux RGB LEDs with an angle of radiation of about 100°. the LEDs are dimmed via 8-bit PWM, generated by ATmega8 microcontrollers. each controller is responsible for 4 LEDs, which makes a total of 25 controllers, running at 14.7456 MHz. there are 4 controllers on each PCB, and the outputs are amplified by darlington arrays.

guts of the matrix

every LED has its own small board, including resistors for each color channel. the pixels are separated by a grid made of 4mm plywood. the light in each pixel is diffused by a small piece of air filter material and the frosted plexiglas pane. diffusion was a major problem, and i guess it’s not really possible to achieve perfect diffusion. my solution is a tradeoff between good diffusion and complexity.

pixels separated by a grid

the data comes from a PC via RS232 (USB adapter) at 460800 baud. all controllers read the RS232 line simultaneously and are addressed by a reserved byte in the data stream. so i broadcast the data to all controllers and each one picks out the data it is supposed to read. i’ve reached frame rates beyond 100 FPS.
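assuming one address byte plus 4×3 color bytes per controller and 8N1 framing (the exact packet layout isn’t spelled out here, so that layout is an assumption), the numbers are consistent with the claimed frame rates:

```python
CONTROLLERS = 25
BYTES_PER_CONTROLLER = 1 + 4 * 3  # address byte + 4 LEDs x RGB (assumed layout)
BITS_PER_BYTE = 10                # 8 data bits + start + stop bit (8N1)
BAUD = 460800

bits_per_frame = CONTROLLERS * BYTES_PER_CONTROLLER * BITS_PER_BYTE
max_fps = BAUD / bits_per_frame
print(round(max_fps, 1))  # ~141.8, so frame rates beyond 100 FPS are plausible
```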

the final software was written in JAVA because of its OS independence. it is still in development and probably always will be. at the moment it is capable of playing back animations stored in bitmaps, displaying the game of life and some variations of it, simple particles, and several colorful effects and filters. and of course multiplayer tetris with overlapping playfields. it can drive up to three sane persons really nuts.
the matrix was always supposed to be an entertaining decoration element, so it has to be able to generate an endless variety of content without steady user input. so i let the software surf the internet and jump from link to link. on every website, it collects content. at the moment, it’s just an image of the whole website, but i plan to analyze, for example, the text on the website. the image of the website is analyzed to get its n main colors, which then become the basic colors of graphical effects. i hope to end up with some kind of AI which analyzes the web in depth and shows simulated creative behavior in dealing with forms, colors and movement. but that’s a utopia right now and it’s gonna be a long way.
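as a rough illustration of that color analysis step, here’s how the n main colors could be extracted from a list of screenshot pixels. this python sketch is my own stand-in (the actual software is written in JAVA and surely differs): it coarsely quantizes each channel so that similar colors fall into the same bucket, then picks the most frequent buckets.

```python
from collections import Counter

def main_colors(pixels, n=5, step=32):
    """pixels: iterable of (r, g, b) tuples, e.g. sampled from a screenshot."""
    # bucket similar colors together by dropping the low bits of each channel
    binned = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    # map each bucket back to its center color, most frequent first
    return [tuple(c * step + step // 2 for c in bucket)
            for bucket, _ in binned.most_common(n)]

# tiny demo: a mostly-red "website" with a blue accent
demo = [(250, 10, 10)] * 60 + [(20, 20, 200)] * 40
print(main_colors(demo, n=2))  # reddish bucket first, then the bluish one
```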
another plan was to add some kind of interactivity, but the design is not optimized for adding sensors. maybe a webcam could be used as a proper input-device.
future plans for the hardware include the integration of a netbook to have a completely standalone device which connects to the internet via wifi or ethernet. i considered some embedded solutions, but as old netbooks get cheaper and cheaper, this seems like a reasonable solution.

if you want detailed information about the development and the building process of this project, please visit the project’s own blog (german).
