On GStreamer performance with multiple sources.

I’ve run a couple of experiments with Tetra. Right now the code that manages disconnection of live sources (say, someone pulls the cable and walks away with one of our cameras) more or less works. It certainly does on my system, but with different sets of libraries the main gst pipeline sometimes just hangs there, and it really bothers me that I’m unable to get it right.

So I decided to really split it into a core that does the mixing (either manual or automatic) and separate pipelines that feed it. Previously I had success using the inter elements (with interaudiosrc hacked so its latency is acceptable) to have another pipeline mix video from a file with live content.

Using the inter elements and a dedicated pipeline for each camera worked fine: the camera pipeline could die or disappear and the mixing pipeline churned along happily. The only downside is that it puts some requirements on the audio and video formats.
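Roughly, the split looks like this. A minimal sketch with videotestsrc standing in for a real camera; the channel name and the caps are mine, and the fixed caps on both sides of the inter elements are exactly the format requirement mentioned above:

```python
# One pipeline per camera and a separate one for mixing; they meet through
# the intervideosink/intervideosrc pair on a named channel. The caps and
# channel name below are illustrative.
CAPS = "video/x-raw,format=I420,width=640,height=480,framerate=30/1"

camera_desc = ("videotestsrc is-live=true ! videoconvert ! %s "
               "! intervideosink channel=cam0" % CAPS)

mix_desc = ("intervideosrc channel=cam0 ! %s ! videoconvert "
            "! autovideosink" % CAPS)

def launch():
    # Requires GStreamer with gst-plugins-bad (home of the inter elements).
    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst
    Gst.init(None)
    mixer = Gst.parse_launch(mix_desc)
    camera = Gst.parse_launch(camera_desc)
    mixer.set_state(Gst.State.PLAYING)
    camera.set_state(Gst.State.PLAYING)
    # Setting `camera` to Gst.State.NULL later does not stall `mixer`;
    # intervideosrc keeps pushing the last frame (or black).
    return camera, mixer
```

With a real camera you would swap videotestsrc for v4l2src; the point is that killing the camera pipeline leaves the mixing one untouched.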

Something I wasn’t expecting was that CPU utilization went down. Before, I had two threads using 100% and 30% of CPU time (and many others below 10%), with both cores at 80% load on average. With separate pipelines linked by the inter elements I had two threads, one at 55% and a couple of others near 10%, and both cores a tad below 70%.

Using shmsrc / shmsink yielded similar performance, but as a downside it behaved just like the original setup when sources were disconnected, so for now I’m not considering them for ingesting video. On the other hand, latency was imperceptible, as expected.

Using the GStreamer Controller subsystem from Python.

This is more or less a direct translation of the examples found at gstreamer/tests/examples/controller/*.c to their equivalents using the gi bindings for GStreamer under Python. The documentation can be found in the GstController reference. Reading the source also helps a lot.

The basic premise is that you can attach a controller to almost any property of an object, set an interpolation function and give it (time, value) pairs so the property is changed smoothly. I’m using a pad as the target instead of an element just because it fits my immediate needs, but it really can be any element.

First you need to import Gstreamer and initialize it:

#!/usr/bin/python
import gi
import sys
from gi.repository import GObject
gi.require_version('Gst', '1.0')
from gi.repository import Gst
from gi.repository import GstController
from gi.repository import Gtk
from gi.repository import GLib

GObject.threads_init()
Gst.init(sys.argv)

Then create your elements. This is by no means the best way, but it lets me cut down on the boilerplate.


p = Gst.parse_launch ("""videomixer name=mix ! videoconvert ! xvimagesink
videotestsrc pattern="snow" ! videoconvert ! mix.sink_0
videotestsrc ! videoconvert ! mix.sink_1
""")

m = p.get_by_name ("mix")
s0 = [pad for pad in m.pads if pad.name == 'sink_0'][0]
s0.set_property ("xpos", 100)

Here I created two test sources, one with bars and another with static that also has a horizontal offset. If we were to start the pipeline right now (p.set_state(Gst.State.PLAYING)) we would see something like this:

(screenshot: captura_testinterpolation, the snow pattern offset over the bars)

So far it works. Now I’d like to animate the alpha property of s0 (the sink pads of a videomixer have interesting properties like alpha, zorder, xpos and ypos). First we create a control source and set the interpolation mode:

cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)

Then we create a control binding for the property we want to animate and add it to our element:

cb = GstController.DirectControlBinding.new(s0, 'alpha', cs)
s0.add_control_binding(cb)

It is worth noting that the same control source can be used with more than one control binding.

Now we just need to add a couple of points and play:

cs.set(0*Gst.SECOND, 1)
cs.set(4*Gst.SECOND, 0.5)
p.set_state (Gst.State.PLAYING)

If you are not running this from the interpreter, remember to add GObject.MainLoop().run(), otherwise the script will exit instead of keep playing. Here I’ve used absolute times; to animate in the middle of the playing state you need to get the current time and set the points accordingly. Something like this will do in most cases:


start = p.get_clock().get_time() # XXX: you better check for errors
end = start + endtime*Gst.SECOND

Avoiding too much bookkeeping

You can get the controller and control source of an element with:

control_binding = element.get_control_binding('property')
if control_binding:
    control_source = control_binding.get_property('control_source')

Installing a NextWindow Fermi touchscreen under Ubuntu 13.04 (Raring)

So, last week we bought an HP AIO 520-1188 to use with Tetra. It is a really nice machine, wonderful sound and display quality, very easy to disassemble. It came with an integrated tv tuner, infrared control and wireless keyboard and mouse. Strangely, it used only the necessary amount of packaging.

To actually use the touchscreen one needs to install the nwfermi packages found at https://launchpad.net/~djpnewton.

The kernel driver is managed with dkms. To get it to build I replaced the occurrences of err with pr_err and commented out the call to dbg(). The sources are installed by default at /usr/src/nwfermi-0.6.5.0. After those changes, run:

dkms build -m nwfermi -v 0.6.5.0
dkms install -m nwfermi -v 0.6.5.0

The Xorg input driver needs to be recompiled, as the latest version on the PPA was built for a different Xorg ABI version. I grabbed the sources from https://launchpad.net/~djpnewton/+archive/xf86-input-nextwindow/+packages.

The prerequisites to build it are installed with:

apt-get install build-essential autoconf2.13 xorg-dev xserver-xorg-dev xutils-dev

(The guide says to install xorg-x11-util-macros; its contents are now in xutils-dev.)

After that do
chmod +x autogen.sh ; ./autogen.sh
make
make install

The old (and nonworking) driver is still present, so we remove it:
rm /usr/lib/xorg/modules/input/nextwindow_drv.so

Reboot the system and you are set to go.

The provided debs worked fine with a stock Debian Wheezy.

I had no luck making the userspace daemon work on a 64-bit distro (so for now I’m limited to a tad less than 4 GB of RAM), but I think it’s a matter of time.

Gstreaming…

I’ve been working with GStreamer on a cool project for a little more than a month. Almost everybody told me that GStreamer is really nice if all you want to build is a player, but that things tend to get difficult really soon for anything else.

For the first week I struggled to do even the simplest stuff, but after that it became quite manageable and I barely had to think. Except when dealing with dynamically removing and adding elements. And renegotiation errors. Fuck. I remove a source, I add another one exactly like the former, and bam: “streaming task paused, reason not-negotiated (-4)”. Bummer. I resorted to cycling PLAYING – READY – PLAYING, but it feels plain wrong.
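The cleaner pattern I eventually settled on is blocking the pad before touching the graph. A hedged sketch, where `make_new_src` and the element wiring are hypothetical and the swap assumes the old source feeds the pipeline through its src pad:

```python
def replace_source(pipeline, old_src, make_new_src):
    """Swap `old_src` for the element returned by make_new_src() without
    stopping `pipeline`. Assumes Gst.init() has already been called."""
    from gi.repository import Gst

    srcpad = old_src.get_static_pad('src')

    def on_blocked(pad, info):
        # Dataflow is blocked here, so the graph can be edited safely.
        pad.remove_probe(info.id)
        peer = pad.get_peer()              # where the old source was linked
        pad.unlink(peer)
        old_src.set_state(Gst.State.NULL)
        pipeline.remove(old_src)

        new_src = make_new_src()
        pipeline.add(new_src)
        new_src.get_static_pad('src').link(peer)
        new_src.sync_state_with_parent()   # catch up with the pipeline state
        return Gst.PadProbeReturn.OK

    # Block downstream dataflow; the swap happens in the callback.
    srcpad.add_probe(Gst.PadProbeType.BLOCK_DOWNSTREAM, on_blocked)
```

This sidesteps the PLAYING – READY – PLAYING dance, at least in the cases where the replacement produces the same caps as the original.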

Also, I don’t know the difference between sinc, sync and sink anymore.

Lazy afternoon at the foundry.

It’s not my fault if you start caring about grid systems, typefaces or history of writing instruments.
Design Fundamentals for Developers (MIX09):
The best three hours I spent this weekend. This is a workshop given by Robby Ingebretsen. Check out his original post at http://nerdplusart.com/mix09-design-fundamentals-for-developers and grab the videos and slides from:

Part 1: Process
Part 2: Composition
Part 3: Visuals

Design Fundamentals for Developers: Slides

Random snippets:
The process of design is used to bring order from chaos and randomness.
Sorta like art & has a black shirt.
The alternative to good design is bad design, not no design at all (Douglas Martin).
Authenticity is invaluable; originality non-existent (Jim Jarmusch).
There is no color that is better than black. […] To me, black is black and red is color. (Massimo Vignelli).

Typography in 8- and 16-bit systems:
Discussion about the fonts used in the most influential systems of yesteryear (with the fonts available for download in TTF format).

Typography in 8-bit system fonts
Typography in 16-bit system fonts

The 60s at grain edit:
This is where I go when I need something to inspire me http://grainedit.com/tag/1960s/

On the road to CISL 2011

In a few hours I’m off to the Conferencia Internacional de Software Libre to present, together with my colleagues from Crear, the prototype of our latest project, “Guitarra Vas a Llorar”.

The central idea is to add to a standard guitar an electronic device capable of showing chords, scales and songs on the fretboard. LEDs (light-emitting diodes) indicate where the guitarist should place their fingers. It can be used in two ways:

  •  Manual: a button pad and a small LCD from which you can select what to show on the guitar’s fretboard (scales, notes, chords).
  •  Automatic: connect it to a computer with a USB cable and, through software, play back a song, a study lesson or a chord sequence so that it is shown on the guitar.

More info about the project:
http://www.elarteylatecnologia.com.ar/spip.php?article60

Program of the Conferencia Internacional de Software Libre:
http://www.cisl.org.ar/index.php?option=com_content&view=article&id=373&Itemid=466

Electronic poetry.

Crazy spam that reached me from China.

“Most impresarios believe that asteroid behind bur near corporation.He called her Charity (or was it Charity?).tape recorder over meditates, and from tenor leaves; however, espadrille from admonish..Still host her from lover inside, bestow great honor upon her defined by diskette with related to defendant.Most dahlias believe that inside support group assimilate over haunch.”

Power interface

I just finished routing the board for a little project we started with Crear. Nothing out of this world: a shift register, optocouplers and triacs. I used the gEDA suite together with some scripts by Kai-Martin Knaak. On the laptop the “photorealistic” render takes a while, but for showing off the product before it even exists they are great.

A simple MIDI interface with an Arduino in half an hour.

A couple of weeks ago, with the folks from Crear, we bought an Arduino (and a bunch more stuff at Adafruit); it arrived much faster than we expected. While the IDE was downloading I started tinkering, and within half an hour we were already controlling Ardour. The sketch is dead simple, based on arduimidi/ttymidi.
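ttymidi just forwards standard serial MIDI bytes to ALSA, so the heart of such a sketch boils down to writing three-byte channel messages to the serial port. A Python sketch of the framing (the helper names are mine, the byte layout is from the MIDI spec):

```python
# Build raw MIDI channel-voice messages like the ones the Arduino sketch
# writes to the serial port for ttymidi to forward.

def note_on(channel, note, velocity):
    # status byte 0x90 | channel, then note and velocity (7 bits each)
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note):
    # note-off is status 0x80; velocity 0 is customary
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# middle C, full velocity, channel 0
msg = note_on(0, 60, 127)
```

On the Arduino side the equivalent is three Serial.write() calls per message, at the baud rate ttymidi is configured for.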

The Restroom.

So there I was at FLISoL 2011 in La Plata, and I go down to the first floor to use the restroom. I’m about to walk in when a guy on his way out says “careful”. Both stalls are taken, so I wait, making faces at the mirror to kill time.

In the background you could hear a thud-thud-tap-thud-thud… I thought it was echoing from the exhibition in the hall, but then it sounded all too familiar, and regrettably close. Someone was preparing Thanksgiving dinner, starting by strangling the turkey.

I clear my throat rather conspicuously. If I’m going to go through an embarrassing moment I don’t want to be alone, though I doubt whoever was inside the stall cared much. A creak, the door opens, and the horror: a being looking a lot like Soldán tells me in a somewhat spasmodic voice, “go ahead kid, I’m almost done”. Next time I’ll cross over to the San Martín. I don’t care if they disinfect it every two hours.

Controlling an LB1946 via usb with an AVR

The LB1946 is a very nice chopper driver; the only downside is that it has a serial interface. For a project I need to control some steppers. I have a box full of printer boards, and picked this one from an old Epson inkjet because it had the same chips, and it also looks like I can chop it in half with a hacksaw and use the halves as they are.

The logic board and the original power supply were toast, so I used another supply I had at hand. For testing I hacked the PowerSwitch circuit from Objective Development to send raw commands and spin the motor over a USB interface.

In the end it doesn’t look quite like the original, but it works. I still can’t get more than 1000 steps per second, but I think that’s because of the supply. Now that I have the basic code working I’ll have to make a parallel interface so I can use them with EMC.
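The “serial interface” part is just clocking a control word into the chip one bit at a time. I won’t reproduce the LB1946 word layout here (that’s in the datasheet), but the bit-banging itself looks like this hypothetical sketch, where the list stands in for what the AVR would put on the data line between clock pulses:

```python
# Hypothetical bit-banging sketch: shift a control word out MSB-first, one
# bit per clock pulse, the way the AVR firmware feeds the chopper driver.

def clock_out(word, nbits=16):
    """Return the bit sequence as it would appear on the data line,
    most significant bit first."""
    return [(word >> i) & 1 for i in range(nbits - 1, -1, -1)]

# an arbitrary 16-bit control word, just to see the ordering
bits = clock_out(0b1010000000000001)
```

On the real hardware each returned bit becomes a port write followed by a clock pulse; the word width and field meanings come from the datasheet, not from this sketch.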
