Dell XPS 13 9360 – Trackpad Issue

Several days ago the trackpad stopped working. It started when I connected an Apple Magic Mouse for more comfortable gaming. When I disconnected the mouse, I realized that the trackpad had stopped working. Thinking the laptop had just choked, I restarted the system. And sure enough, the trackpad worked again… for some hours. After that it just stopped working again. Sometimes it repaired itself to the point that I could use it, but no multi-touch gestures (pinch, scroll etc.) would work. I had a look in the Windows Device Manager and saw a yellow exclamation mark next to an I2C HID device. The error message states “A request for the HID descriptor failed”, whatever that means.

Error message in Device Manager

Unfortunately there is no solution known to me. In the Dell forum two threads (one, two) describe my problem, yet nothing seems to solve their issue. I’ll contact Dell support about this :-/

Travel Laptop

When we started our tour three months ago, I replaced my trusty Lenovo Thinkpad T520 with a shiny new Dell XPS 13 (9360). The reasons were manifold: as weight and volume are at a premium while travelling, half the weight and less than half the volume is awesome. As I didn’t intend to do a lot of computer stuff, I settled for a 13 inch screen instead of a 15 inch screen. The other main reason is definitely the battery life. 10 hours of light work or browsing are easily achievable. My previous laptop seldom stayed alive longer than three hours, even with the bigger battery pack.

Obviously I had to make sacrifices somewhere: I already mentioned the smaller screen. Also, I traded power for battery life. Before, I had an Intel i7 quad-core; now it’s “only” a dual-core (although a faster one). I also had to do without the dedicated graphics unit I had before. But my main drawback is the missing TrackPoint. I know people either love or hate them, and I loved it.

Out of the different configurations I chose the 8 GB, Core i5, Full-HD non-touch display with a 256 GB disk and Ubuntu pre-installed. Because I did not know whether I would use this laptop as much after our travels as during them, I tried to keep the costs as low as possible. Although I knew I would install Windows, I used the Ubuntu installation for a while. Quite a nice experience, I have to admit.
Nonetheless I installed Windows. After some initial problems (the WiFi card was not supported out of the box by Windows, so I couldn’t download other drivers…) I installed Windows 10. I also installed “Dell Update” for up-to-date drivers.

(Not really a) review
I would love to do a review of my machine, but I do not use it enough to call it a real “review”. I use it to keep track of my finances, to book tours and for some light gaming. Also the occasional programming task whenever an idea strikes me. Nonetheless I can say that I am (was) very happy with the laptop. The battery life is fantastic. The display is crisp and the fan spins up only while gaming (or when Windows Update is running wild…). The bezel of the display is slim and gives the laptop a very good size-to-display ratio. Smaller drawbacks are the missing Scroll Lock (“roll”) key (which I never use, but it was activated on Windows by default; use the on-screen keyboard to toggle it) and the fat-chin camera (the camera is located in the lower part of the bezel).

So all in all: a good laptop, and I’m looking forward to giving Ubuntu another try when I’m back 🙂

Solved: ImageJ / Fiji error: “IllegalArgumentException: adding a container to a container on a different GraphicsDevice”

For my master’s thesis I am experimenting with ImageJ / Fiji. While working with some image registration algorithms I happened to run into a strange bug I couldn’t explain:

Workaround / Fix:

If you happen to hit this bug, make sure all ImageJ / Fiji windows are on the same monitor. The bug only occurred for me when my Fiji toolbar window was on my main monitor and the plugin windows (Image Sequence loader and Linear Stack Alignment with SIFT) were on my secondary screen.


If you want to reproduce this issue, use a system with the above-mentioned specs (see stack trace).

  1. Open Fiji on the main display
  2. Use “File / Import / Image Sequence…” to load at least two images
  3. Execute “Plugin / Registration / Linear Stack Alignment with SIFT” on the images
  4. When the registration is performed, move all Fiji windows but the Fiji toolbar to a second display
  5. Close all the Fiji windows but keep the Fiji toolbar open
  6. Repeat steps 2) and 3)
  7. The exception gets thrown


Related bug in the ImageJ forum

Bug on the Oracle bug tracker


Elixir for aspiring Erlang developers

For the “Independent Coursework” of the University of Applied Sciences Berlin I created the following presentation:


The target audience are students of the bachelor’s degree in Computer Science at the university.

If anything is unclear (or, god forbid, wrong), drop me a mail or a tweet.

Powershell cmdlets with dynamic param AND $args don’t work

Over the weekend I tried to implement auto completion for Elixir’s mix (on Windows). Unfortunately I didn’t manage it without introducing some problems, so I didn’t commit my changes upstream. Currently I am trying to reach some of the more renowned Elixir/Windows contributors to discuss the changes.


Under normal circumstances I don’t use more mix tasks than test, phoenix.server and release, but sometimes you need this one weird command you just can’t remember. The command mix help is your friend here, as it shows you all available commands (project aware!). Yet I don’t like looking up the documentation if I just need some information on spelling. For example, in the beginning I often tried to start the Phoenix project with mix phoenix.start (hint: that does not work). I am used to auto completion in my development environments, so I tried to extend mix as well.


As I am using Powershell for all my command line related tasks and the default file extension of Powershell scripts is ps1, my command mix executes the mix.ps1 in the Elixir bin folder.


Powershell scripts can have auto completion of parameters with a so-called [ValidateSet("Param1","Param2",...)], which contains all valid parameters. Sadly this is of no help if we would have to hard-code the possible values for the parameter. A possible solution to this problem is the use of a DynamicParam with a dynamic ValidateSet (good resource here). To test my various iterations I wrote down all test cases (sorry, no automated testing yet).
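Adapted to mix, the pattern from that resource looks roughly like this. This is a sketch, not the final mix.ps1: the structure follows the linked blog post, and the variable names ($ParameterName, $arrSet) are the ones I refer to below.

```powershell
[CmdletBinding()]
param()

DynamicParam {
    $ParameterName = 'Task'

    # Standard boilerplate: build an attribute collection for the parameter
    $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
    $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
    $ParameterAttribute.Position = 1
    $AttributeCollection.Add($ParameterAttribute)

    # The dynamic part: ask mix itself for all valid task names
    # (assumes mix.bat is on the PATH)
    $arrSet = mix.bat help --names
    $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)
    $AttributeCollection.Add($ValidateSetAttribute)

    # Register the runtime-defined parameter under the name 'Task'
    $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParameterName, [string], $AttributeCollection)
    $RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
    $RuntimeParameterDictionary.Add($ParameterName, $RuntimeParameter)
    return $RuntimeParameterDictionary
}

process {
    # The old script body goes here (locate mix.bat and call it)
}
```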

Iteration 1


If you have a look at the original mix script (here) you can see that it locates the mix.bat, flattens possible array arguments (is this still needed?) and then executes the mix.bat with the newly flattened arguments.

The first problem we see here is the usage of the $args array. As Keith Hill points out in this SO comment, the $args array “… contain[s] any argument that doesn’t map to a defined function parameter …”. And that is exactly the problem: a DynamicParam ONLY works for defined function parameters.

I copied the linked resource (again, here) and moved the old script into the process block. Because we are creating a script and not a function, the signature function Test-DynamicValidateSet {...} needs to be removed. To generate the ValidateSet I replaced the line $arrSet = ... with a call to mix help --names.

This populates $arrSet with all valid tasks. I also changed the value of the variable $ParameterName to 'Task' and renamed the variable $Path to $Task.


A short test shows that the command mix works, while the command mix help does not. The reason is that the first argument is now bound to the parameter $Task and never passed on to mix.bat.

Iteration 2


The call to mix.bat in the last row now gets the $Task parameter as well:
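Assuming the located batch file is stored in a variable (I’ll call it $mixBat here), the end of the process block now looks roughly like this:

```powershell
process {
    # Pass the bound dynamic parameter through to mix.bat
    $Task = $PSBoundParameters['Task']
    & $mixBat $Task
}
```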


mix works, mix help works. Awesome! Let’s try auto completion: mix [tab] …

This is weird. The auto completion takes its time (this is actually the time mix help --names needs to return all valid tasks), yet the auto completion fills in file names from the current folder… To fix that we need to make clear that our dynamic parameter is actually the first parameter. So after setting $ParameterAttribute.Position = 0 (it was 1), we repeat our tests.

mix works, mix help works, mix [tab] works, mix he[tab] works also. What about arguments to parameters, like mix help --names?


Iteration 3


OK, we need positional arguments. Let’s add some.

I don’t like that approach, because the script will fail with the aforementioned error message as soon as there are more than seven parameters (our dynamic one plus $p1 – $p6).

We also have to forward our new parameters to the mix.bat :
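A sketch of that call, again assuming the names from above ($mixBat for the located batch file, $p1 – $p6 for the positional parameters):

```powershell
process {
    # Collect the task and the positional parameters, drop the
    # empty ones, and splat the rest to mix.bat
    $params = @($PSBoundParameters['Task'], $p1, $p2, $p3, $p4, $p5, $p6) |
        Where-Object { $_ }
    & $mixBat @params
}
```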

OK, besides the now unused “flatten possible array parameter” logic and our “it will fail with eight or more parameters” problem, how good are we?


All tests in the test cases pass. Yet we have some unresolved problems.

Problems with this solution

  1. We can have only a fixed number of parameters. This is not a big problem (we can always add more parameters to the signature), but it is neither elegant nor good practice.
  2. We now completely omit the “flatten array logic”. I have to admit I’m not sure whether this is still needed, so I asked the original contributor of this logic but am still waiting for a response.
  3. Most of the code was copied from our resource. We clearly added some logic of our own, yet we probably shouldn’t use this code without asking for permission. I asked the author if I could use this snippet and am waiting for a response.
  4. Even if I omit the “flatten array logic”, I tripled the lines of code. I don’t know if the auto completion feature is worth this much code (read about code as a liability here)

As soon as problem 3 is clarified I will upload the file here. As soon as the other problems are clarified (and/or fixed) I will create a pull request on GitHub to upstream the changes.

Elixir, Phoenix and Windows: Some insights on my 1000┬Ás problem

On Saturday I wrote about “Elixir, Phoenix and Windows: No faster responses than 1000 microseconds?“. I described two problems I had with Phoenix on Windows: my response times seemed to be stuck at 1000 microseconds, and Powershell couldn’t display the μ sign. I dug into some code and the mailing lists and found (with a lot of help) some answers.

1000 Microseconds on Windows?

The response times in Phoenix are often measured in microseconds. Yet on Windows you won’t see any requests faster than 1000 microseconds. That’s not because of a slower OS, but because of a timer that is not as precise as needed:

On Windows a developer has several options to get the system time, with differing precision:

  • The time service (1 ms)
  • The performance counter (between 1 ms and 1 μs)
  • The GetSystemTimeAsFileTime API (100 ns)

The highest precision (100 nanoseconds) can be achieved with the GetSystemTimeAsFileTime API (or the GetSystemTime API, which returns the same data, differently formatted).

This is actually the API the Erlang VM (which provides the runtime for Elixir and Phoenix) is using. So in theory we should be able to get more precise data out of this API. Yet the Erlang VM only returns milliseconds as its smallest unit. I’m pretty sure there are reasons for it, but I don’t understand them. If you are curious (and don’t fear a little bit of C code) go ahead and look at the implementation of the timing in Erlang: sys_time.c in the Erlang VM

1000┬Ás instead of 1000μs?

The second problem I had on Saturday was the missing μ sign in my Powershell environment. I got the hint that I have to set the code page of Powershell to UTF-8:
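The original snippet is missing from this post; the usual way to do this, and what the hint amounted to, is switching the console code page to UTF-8 (Windows only):

```powershell
chcp 65001
```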

This fixed the problem for me. Unfortunately it introduced another, more serious bug: when printing special characters in iex.bat, iex now stops reacting completely. Until this bug is fixed, I strongly advise against this fix.

Elixir, Phoenix and Windows: No faster responses than 1000 microseconds?

If you read around Phoenix developers you often hear things like “Awesome, requests served in under xxx microseconds!”. Yet when I try the Phoenix framework, I only get results like this:

With special emphasis on [info] Sent 200 in 1000┬Ás . Here we have two problems:

  1. It looks like the command line doesn’t know the μ sign and replaces it with ┬Á.
  2. Did it really take 1000 microseconds to serve the request? I’m not sure, but I NEVER get a request faster than that. Sometimes I get a slower request (163 milliseconds on startup, for example), but never one faster than 1000 microseconds.

Locating the error source

Let’s find the culprit: I use Powershell to start the Phoenix app. Can Powershell show the μ sign?

Powershell using the μ sign
It can!
Powershell showing off and using the μ sign as a variable name
Variables with the μ sign are possible as well

As a matter of fact, you can use the μ sign as a variable name if you want to.

So Powershell is fine. What’s the problem then? Looking into the mix.ps1 we can see that it executes the mix.bat, which executes the elixir.bat, which in turn executes either erl.exe or werl.exe. So let’s have a quick look whether cmd (*.bat files are executed by cmd) is capable of showing the microsecond sign.

Also the command line can show the sign

So the problem isn’t the command line either. I printed the final call to erl.exe and executed this command without cmd as middleman. The problem remained. So I assume it is a problem with either erl.exe (if I use the --werl command line switch, werl.exe gets executed and the Phoenix app starts, but no info is shown in werl.exe), Elixir, Phoenix or some of the plumbing in between.

Let’s test Elixir. I created a new Elixir app with mix new micro_second_test and wrote a single function in it:
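The original listing got lost; the function essentially just printed a string containing the μ sign, along these lines (module and function names are my reconstruction):

```elixir
defmodule MicroSecondTest do
  # Print a duration string the way Phoenix's logger formats it
  def print_duration do
    IO.puts("Sent 200 in 1000μs")
  end
end
```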

When starting the application with iex.bat -S mix and executing the function, we get this result:

So we can rule out Phoenix as Elixir already has that problem. What about erl.exe?

Erlang can show the μ sign. So the culprit is either Elixir or the plumbing. I will open a GitHub issue for this problem. For the second problem (never showing less than 1000μs) I am not sure how to check. I think it could be in the Cowboy web server or in Plug.Conn. But I have no clue…

Elixir on Raspberry Pi 2 (using Windows)


Deploy an example Elixir application with the Nerves-Project to a Raspberry Pi 2, using an Ubuntu guest in VMware Player on a Windows host.


We are using the Nerves project, buildroot and Ubuntu in a virtual machine to create an SD image which contains the default application, blinky.


After seeing this talk by Garth Hitchens on Embedded Elixir in Action, I wanted to try the Nerves project to deploy an Elixir application to my Raspberry Pi 2. Yet, as I’ve blogged before, I’m using a Windows machine, and most of the build tools (for the Raspberry Pi) require Linux.


We will need to install various pieces of software on our computer, so I assume you have the rights to do so. We also need disk space (around 15 gigabytes). Because we are using a virtual machine, it does not hurt to have a beefy computer. More RAM + CPU = better.


There are several steps needed to reach our goal, and most of them require some Linux usage. Bear with me, we will make it! (Also have a look at Linux for Dummies)

Install VMware Player (10 Minutes)

On your Windows Computer do:

We will install a hypervisor on our computer. A hypervisor enables you to run an operating system inside your original operating system (without the risk of destroying your computer).

There are several hypervisor solutions available. On Windows 8 and 10 Microsoft’s Hyper-V software is available (and installed by default, I think?). In general I would go with the defaults, but I had deactivated Hyper-V for various reasons in the past (no 3D virtualization being one of the most pressing) and chose VMware Player.

After the installation of VMware Player, download the Ubuntu 14.04 LTS (64-bit) server image (or the desktop image, I don’t care). Be warned: 32-bit WON’T WORK!

Create a virtual machine (5 Minutes)

On your Windows Computer do:

Start VMware player and hit “create a new virtual machine”. In the wizard window choose the “Installer disc image file (iso)” option and select your previously downloaded Ubuntu server image.

Hit next and go along with the defaults* until it finishes.

* If you are constrained on disk size, you can reduce it from 20 GB to 10 GB. I tried it with 5 GB but failed while downloading all the needed tools.

Install Ubuntu 14.04 LTS (30 Minutes)

In the virtual machine do:

Start your virtual machine (green arrow) and go along with the installation defaults. I changed some aspects which I will highlight now:

  • Language: I’m using German as system language and as keyboard layout.
  • Encrypt file system: I politely declined. I couldn’t care less on this build machine, and in theory it is also a minor performance drain.
  • Packages to install: I checked the OpenSSH box

Let the installation finish. It will take some time and asks questions in between instead of all at the beginning, so keep an eye on it. As a side note: remember your username and password. We will need them. For convenience use KeePass 2 or similar software.

Get IP Address (1 Minute)

In the virtual machine do:

Finally we can log in. For later use we need the IP address. After you have logged in, enter ifconfig in the terminal. This will bring up the network interface configuration. We need the IP address of the interface eth0; it is listed in the inet addr: field of that interface, and the address itself is all we need.
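If you prefer to script that step, the address can be cut out of the ifconfig output like this (the sample line and address below are illustrative, not necessarily what your VM reports):

```shell
# Sample eth0 line as printed by ifconfig on Ubuntu 14.04;
# on the VM you would use the real output: ifconfig eth0 | sed -n '...'
line='inet addr:192.168.116.128  Bcast:192.168.116.255  Mask:255.255.255.0'
echo "$line" | sed -n 's/.*inet addr:\([0-9.]*\).*/\1/p'
```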

Install Nerves (10 Minutes + 45 Minutes of waiting…)

In the virtual machine do:

I installed everything in my home folder, but feel free to do it wherever you like (just remember the location later on :-P)

Update your package sources:
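The listing is missing here; on Ubuntu this boils down to the standard command (run inside the VM):

```shell
sudo apt-get update   # refresh the package index
```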

The following code is provided by the Nerves-Project:

At this point the Nerves tutorial could be more to the point. They state “… Change to the nerves-system-br directory …”, yet there is no such folder. To fix this, clone the nerves-project/nerves-system-br project from GitHub!
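In other words (the repository lives in the nerves-project organisation on GitHub):

```shell
git clone https://github.com/nerves-project/nerves-system-br.git
```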

Now we can change into the nerves-system-br folder:

Now we can follow the tutorial on the nerves-project Github page along again:

And now, to kickstart the future “real” builds, let’s do a dry run to cache all files (this will take a while):

This takes around 45 Minutes.

Using Nerves (10 Minutes)

The tutorial states that one has to source the environment. This sets some variables and is needed every time you bring up a new console.

For testing I cloned the blinky example from GitHub

Change into the new folder and then into the blinky directory

Now the magic can happen:

This will create a file _images/blinky.fw

*.fw is the file format for fwup (don’t worry, it is already installed). After some research (and lots of help from Frank Hunleth and Garth Hitchens) we figured out the correct command:
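The command looked like the following (reconstructed from memory, so double-check fwup --help on your version): -a applies the firmware, -i names the input archive, -d the output file and -t the task to run.

```shell
fwup -a -i _images/blinky.fw -d _images/blinky.img -t complete
```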

Using this command we create a Raspberry Pi image file (_images/blinky.img)

Get the image (5 Minutes)

On your Windows Computer do:

The resulting image is located in your virtual machine at ~/nerves-examples/blinky/_images/ .

To access that folder we need to download it from the virtual machine to your Windows computer. You can access the folder via WinSCP if you opted in for the OpenSSH package before:

Open WinSCP (if installed; otherwise download it, e.g. via the Chocolatey package) and create a new connection. The computer name is the IP address we saved before. Username and password are your credentials from the virtual machine (I told you to remember them!).

You will be presented with two explorer views. The left side is your computer, the right side is the virtual machine, already in the home folder of your user.

If you cloned the nerves-examples.git into the home folder as I suggested, you can easily follow the path. On the right side, click through the folders to ~/nerves-examples/blinky/_images/ until you find your file, which is called blinky.img. Copy the file to your computer, for example to the desktop.

“Burn” the image (15 Minutes)

On your Windows Computer do:

To burn the image we need a tool which writes our downloaded image to the SD card. Follow the linked tutorial.

Fun (Countless hours)

On your Raspberry Pi do:

Insert the SD card into your Raspberry Pi and power it on. Depending on your setup, something should happen! For example, the blinky example makes an LED blink; the demo example (should, but didn’t work for me) lets you connect via Ethernet.

The fun you can have


  • Update 23.12.2015: Added hint for 64bit after @mdsebald mentioned it
  • Update 22.12.2015: Fixed the “Get the image” part
  • Update 22.12.2015: Fixed the tutorial
  • Update 21.12.2015: Marked the tutorial as broken