Articles - 18.7.2007 - 21.9.2015

Hero Forge Custom Miniatures has an online designer with various configuration options for game figures. Not exactly cheap, but if you really need a figure for a game, it might still be worth it.

After considerable reflection and discussion, we've decided the time has come for us to step aside and seek new owners for Koken. As much as we'd like to stay, all of us have decided to pursue new professional opportunities that will prevent us from…

Source: Koken is for sale and looking for a new home - Koken blog

We decided to do something totally new for us this year for our vacation: a tour through France by camping car. And it was a blast! Originally we only planned for northern France, but due to the weath…

Source: Tour de France

Magic for Fun

I enjoy playing Magic: The Gathering myself - mainly EDH, since it usually makes for a nice game in a nice group and is not so focused on tournaments and winning. At least in the right group. But then comes the problem: in practice I can only play on MTGO (online), since I rarely make it to the local game meetup on time. Of course you can just play with a few buddies, but then you need decks. And then several people end up playing unfamiliar decks built by someone else - and EDH decks do have a thematic or mechanical focus that you need to know in order to pilot them sensibly and well. That favors the person who built the decks and knows them.

So I have recently looked into a few multiplayer variants of Magic that can be set up as a spontaneous game round but also satisfy my second passion: deck building.

There is, of course, the best-known and probably most widespread variant: the Cube. Someone builds a large stack of cards into a Cube, from which drafts are drawn. This comes with several problems, though. The first: the Cube has to be very well balanced, because you are drafting - i.e. randomly assembling cards into booster packs from which the players then build their decks. Essentially you are putting the same design work into the Cube that you would put into a Magic set. That can be a lot of fun, but it requires a lot of effort. Of course you can take published Cube lists from the net and buy the cards - but usually you want your own Cube. The second, more serious problem for me: you are playing one of the Limited formats, Draft or Sealed. So shuffle the whole Cube (problem three: how do you shuffle 500 cards?), pay attention to rarity (not necessarily the original rarity, more a weighting into A/B/C cards), then assemble "boosters" from it, and then you can start playing. Sorry, no - first comes the draft. And then deck building. Oops - and that quickly loses the beginners, because casual players may pick up how to play quickly, but drafting or building a deck is a bit more involved. So the Cube is out for me.

Then I stumbled across the Stack format and was immediately enthusiastic. I don't really want to design my own set - I just want to play with cool cards, and in a way that lets you bring in casual players after an open round at the latest (i.e. a round where all decks are played face up, so you can explain things and help each other). The design aspect is there - I can think about which cards to include as much or as little as I want - and the setup effort when you want to play is low. And thanks to the format's automatic mana fixing, beginners have fun too, because they usually get something to play that actually has an impact. Only two problems remain: on the one hand you again have to shuffle 500 cards and then draw subsets from them, so you never know how a game with the community deck will turn out; on the other hand, despite the mana fixing, producing a consistent game is hard - it is simply much more random. Deck design is almost non-existent; there is not even a preliminary choice of colors.

Another casual format from the Stack family is Big Box Magic (or Commie Box Magic, as it is called now). There is also a shared library, but divided by color. Each player builds their color identity beforehand through the choice of their 12 lands. What I like about this: I can at least set a rough direction, because the colors really do play differently. And through the land selection I already have a rough idea of how I want to play - unfortunately, that can quickly collide with the reality of the drawn cards. If I bet on blue-white with a green splash and the green stack only gives me triple-green fatties, things quickly turn sad.

Beyond that, both formats share a problem: they are not even remotely something like Commander. Yes, they are singleton formats (at least if the Stack/Cube designer wants that), but there are no commanders. That led me to design my own variant, a mix of Commander and the two Stack formats: my attempt to build, from these ideas, a common basis for playing Magic spontaneously as a social game, without much setup, with design options for the players and with deck-building options for me. And without having to shuffle 500 cards (at least not all at once).

The basic idea is a shared library as in Commie Box Magic (i.e. the 5 colors as separate stacks - and in my case really only mono-colored cards) and a shared graveyard. The graveyard is divided by color just like the library, and there is a sixth stack with artifacts and non-basic lands, plus a corresponding graveyard. All libraries and graveyards are common property - but always filtered by the color identity of the player in question. I decided on 70 cards per color, with 30 basic lands laid out for each color. With many players you should probably lay out a larger land stack, otherwise it empties too quickly.

Color identity is derived from each player's choice of commander. I have a set of dragons - the shard and enemy-color dragons, with the two sharing the same primary color always bundled together. That is enough for up to 5 players: each picks a primary color and then their command dragon. For more than 5 players I have a set of guild commanders, two per guild. That scales up to 10 players, which should be entirely sufficient for my purposes. The guild commanders are also useful when several players want the same primary color and cannot agree on how to distribute the dragons. Or when you simply want to play two colors instead of three - this is a bit more limiting, but has the advantage that players develop a feel for their two focus colors faster, simply because the number of cards is smaller. The dragon variant is also ideal for Two-Headed Giant: the two players of a team take the dragons with the same primary color, so the team covers all 5 colors while still having synergies. That allows up to 5 teams of two.

A game starts with the selection and distribution of the commanders. Then each player draws their hand from the library - only from the stacks matching their colors, of course. Each player may draw 3 or 4 cards from the actual card stacks and then fill their hand up to 7 from the lands. Drawing happens in turn order, since the library stacks are shared: player one draws their first card, then player two draws their first, until everyone has one - then comes the round for the second card, and so on. This way, each player can steer which color they draw (or whether they need artifacts), as well as when, how many, and which lands they take. This should give everyone a playable starting hand without elaborate mulligan rules.
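For illustration, here is a minimal Python sketch of that opening draw. The stack sizes match the setup described above, but the player names, the fixed number of spell draws, and the random source choices are my own assumptions for the example, not fixed rules of the format:

    import random

    COLORS = ["W", "U", "B", "R", "G"]

    def make_stacks(cards_per_color=70, lands_per_color=30, artifacts=100):
        # Shared library: one spell stack and one basic-land stack per color,
        # plus the shared artifact stack.
        stacks = {c: [f"{c} spell"] * cards_per_color for c in COLORS}
        stacks.update({f"{c}-land": [f"{c} land"] * lands_per_color for c in COLORS})
        stacks["artifact"] = ["artifact"] * artifacts
        for pile in stacks.values():
            random.shuffle(pile)
        return stacks

    def opening_hands(stacks, identities, spell_draws=4):
        hands = {p: [] for p in identities}
        # Round-robin: everyone takes their first card before anyone takes a second.
        for _ in range(spell_draws):
            for player, identity in identities.items():
                # The player chooses the source; here we pick randomly among
                # the identity colors and the shared artifact stack.
                source = random.choice(list(identity) + ["artifact"])
                hands[player].append(stacks[source].pop())
        # Fill up to 7 cards from the land stacks of the player's identity.
        for player, identity in identities.items():
            while len(hands[player]) < 7:
                hands[player].append(stacks[random.choice(list(identity)) + "-land"].pop())
        return hands

    stacks = make_stacks()
    print(opening_hands(stacks, {"Anna": ("U", "W"), "Ben": ("B", "R")}))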

The game itself plays just like normal EDH - you start at 40 life, commander damage kills from 21, poison counters kill from 10, and so on. Everything as usual. Special rules apply only to drawing cards, to cards that reference the library, and to cards that reference the graveyard. There is one more special rule for the commander: there is no tuck. The commander always goes to the command zone, never into the library - otherwise someone else could draw my commander and be unable to play it for lack of matching mana.

Card drawing comes in two variants: the standard draw and draws triggered by cards or abilities. For the normal draw in the draw step, the player chooses which part of the library to draw from; all colors of their color identity plus the artifact stack are available. A player can draw either a card in their color identity, a land in their identity, or a colorless card. Additional card draws work the same way, except that the artifact stack is not available. The reason is simple: all players share the artifact stack, and it would empty far too quickly if you could always draw from it - artifacts are, after all, often the "jokers" in Magic. If a stack is empty, you can no longer draw from it. Since every player has at least 300 cards to draw from through their colors (two color stacks, two basic-land stacks, and the artifact stack in the guild variant; 400 cards in the dragon variant), you should always be able to draw something. As in normal EDH, the first player who can no longer draw loses.
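Summed up in a few lines, the legal-sources rule looks like this - a sketch of my reading only, with stack names as in the sketch above:

    def legal_sources(identity, normal_draw):
        # Normal draw step: identity colors, identity lands, or a colorless
        # card from the shared artifact stack. Triggered ("additional")
        # draws exclude the artifact stack so it doesn't empty too quickly.
        sources = list(identity) + [c + "-land" for c in identity]
        if normal_draw:
            sources.append("artifact")
        return sources

    print(legal_sources(("U", "W"), normal_draw=True))
    # ['U', 'W', 'U-land', 'W-land', 'artifact']
    print(legal_sources(("U", "W"), normal_draw=False))
    # ['U', 'W', 'U-land', 'W-land']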

What about cards that refer to the library? Tutors, for example (I avoid them in my own stack), or cards that look at the top cards of the library? They behave just like additional card draws, but without lands - i.e. they go directly to the colored stacks of your own color identity. In addition, the player must choose one color of their color identity before resolving the effect - Scry 3 therefore looks only into the stack of a single color (and since such a stack contains only non-lands, spreading it over several stacks would be too powerful). The stack designer should avoid cards that, say, refer to lands or non-lands - a chosen stack always contains only non-lands. Some cards are therefore simply incompatible with this format. What about an opponent's library? The same: a color of that opponent's color identity is chosen. And yes, your own library and an opponent's library can overlap. Top-of-library manipulation therefore often has more the character of "taking something away from an opponent" than of "setting something up for the next turn" - Hellsight is more of an offensive tool here. After all, every opponent with an overlapping color identity gets the potential "enjoyment" of my prepared card. This opens up entirely new political strategies!

And the graveyard? Everything ends up in there together. Simple solution: the graveyard is sorted by color just like the library. Cards that go to the graveyard are placed on the stack of their color. My graveyard is then all the graveyard stacks of my color identity, plus the artifact graveyard. An opponent's graveyard is all the stacks of their color identity, plus the artifact graveyard. Reanimation can therefore readily bring back cards that originally belonged to someone else! Even when a card asks for the number of creatures in your opponents' graveyards, simply take the combined color identity of the opponents and count the creatures in the matching graveyard stacks and in the artifact stack. Yes, the Wight of Precinct Six can turn out quite monstrous this way!
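As a sketch, counting "creatures in your opponents' graveyards" then looks like this - the card representation and pile names are again just assumptions for the example:

    def opponent_graveyard_creatures(graveyards, opponent_identities):
        # Union of the opponents' color identities, plus the shared
        # artifact graveyard, as described above.
        piles = set().union(*map(set, opponent_identities)) | {"artifact"}
        return sum(
            1
            for pile in piles
            for card in graveyards[pile]
            if card.endswith("creature")
        )

    graveyards = {
        "W": [], "U": ["U creature", "U instant"], "B": ["B creature"],
        "R": [], "G": [],
        "artifact": ["artifact creature"],
    }
    print(opponent_graveyard_creatures(graveyards, [("U", "B")]))  # -> 3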

The designer of the stack should of course design for the format - mill cards like Archive Trap, for example, make no sense at all, since with the divided libraries it is unclear whose cards you are milling away. Small mill effects are OK, but they don't really make sense either - you tolerate them because you want the card in there for other reasons. In general, you should avoid them. The same goes for cards that talk about owners in graveyards, libraries, or exile (which is likewise divided, but since it is rarely accessed, that part is fairly irrelevant) - "return all creatures in graveyards to the battlefield under their owners' control" is pretty silly here, because who owns a jointly used graveyard stack? So when designing, always make sure the cards can be played here without conflicts or too much head-scratching.

As a general guideline, I set the motto "good-stuff deck with the Theros god as its theme" for each color stack. That gives the color stacks a bit of character, and the players can decide through their color choice which direction their game should take. For the artifacts, I focused on the colorless utility lands and otherwise, of course, heavily on low-cost equipment and utility artifacts. That is also because of the rule above restricting access to the artifact library to the primary card draw - otherwise a Nin player would quickly grab all the weapons ...

One drawback of the format is the poor support for multicolored ("gold") cards - so far they exist only as commanders; everything else is mono-colored. One idea would be to give the commander - at least in the guild version - a private library of gold cards controlled only by that player. That makes things a bit more complicated, but it also offers the chance to design a real guild identity and give the commander choice even more weight. However, you quickly run into the situation described above, where casual players are overwhelmed by evaluating the private libraries and experienced players are clearly favored. I will have to test a bit before deciding on that - best to start without the guild library. That also saves time and cards.

The nice thing about the format: setup is about as simple/complex as one of the more involved board games like Settlers - 5 boxes with 70 cards and 30 lands each, one box per color. Separate out the lands and lay them down, shuffle the 70 cards and lay them down. Then shuffle the 100-card box with the artifacts and colorless lands and lay it down. Put the commanders on the table for selection, and off you go.

Magic plays to one of its strengths here: cards can usually be understood just by reading their text - which is why I made a point of using only German cards. Maybe I will also go through the selection again, weed out overly complicated cards (or cards with keyword abilities and no reminder text) and replace them with simpler ones, even if I have to sacrifice some beloved cards. For example, I already have only one planeswalker per color in there.

Whether the format works will, of course, only become clear in playtesting. I will report back - top or flop.

Nice on the outside, nasty on the inside

That seems to be the motto of Koken, which I once praised quite a bit here. It is still one of the best-looking gallery systems, with a really sleek admin interface. But after a few problems that were nearly impossible to debug, all that sleekness cannot hide the fact that the code underneath is probably not as sleek as the layout. To this day, the login form has problems with Chrome and Safari - and they haven't solved it. From their reactions, it's not even clear they care at all. How silly is it when a login form fails under Chrome because of some JavaScript hacks? It's a simple form with username and password - what is there for JavaScript to break?

The crowning moment came today, again courtesy of the Lightroom plugin. I had adopted Koken because my old blog workflow went down the drain - thanks to the WP programmers' silly decision to scrap Atom publishing entirely and let it rot instead of fixing it. I don't want anything complicated - just a simple way to upload a stack of pictures with one click directly from Lightroom. That works fine if the target is Flickr or Picasa or one of the other supported online services. But I want to control my pictures myself - and host them myself. Well, if a plugin merely creates an empty album and doesn't upload a single picture (even though it churns away for hours and pretends to be busy), it's all for nothing. And if there are no logs, debugging options, or messages anywhere to help with the analysis, it's all fit for the trash.

So for the time being, pictures will probably end up here again, currently via Dossier de Presse, a Lightroom plugin that publishes to WordPress via XMLRPC. That is not optimal, but it is the only thing currently available that works with current WordPress, current Lightroom, and at all. Since WordPress transfers no metadata such as image titles via XMLRPC, I will probably have to remove titles from my gallery layout somehow, otherwise only odd technical file names would show up there. Titling pictures is tedious anyway.

Computers could be so much more fun if software wasn't programmed by amateurs and blockheads 90% of the time ...

Gumbo

Today something exotic: gumbo. And slightly modified at that, because I felt like it. In principle it's quite simple: diced bacon (250g), onions (we had two), garlic cloves (don't be stingy with them!) and a carrot cut into slices. Fry the bacon in the pot, add the meat (we had 300g beef goulash), sear it well, add onions, garlic, and carrot, and let them sweat. Then deglaze with a splash of red wine (not strictly necessary, but I think a bit of red wine always belongs with goulash meat, even if it isn't becoming goulash), add 1l of broth plus bay leaf, ginger, and cumin as spices, and let it cook for about an hour. Then start the rice, and while it is cooking, dice a bell pepper and add it to the soup along with a jar of okra pods (drain and rinse before using); let it cook for about 10 minutes (until the rice is done). We also added some capers (soak them well and long beforehand, otherwise they are too salty). Serve the rice on the plate and ladle the stew over it. We had wine spritzer with it. The ingredients, summarized:

  • 250g diced bacon
  • 300g beef goulash
  • 1l beef broth
  • 1 carrot
  • 2 onions
  • 4-5 garlic cloves
  • 1 bell pepper
  • 1 jar of okra pods (approx. 400g; drain and rinse before use)
  • Ginger powder, cumin, salt, pepper
  • a few capers (soak well before use)

Hyper-V, Virtual Machines, Drive Letters, Madness, Microsoft

Yup, the combination above is really not great. The scenario: a Hyper-V machine, several virtual machines, some with snapshots, various very long-running installations, and a lot of work inside these machines. New machines are created from existing images, each generalized with sysprep, prepared for first use, and then configured.

Enter the system administrator: a new virtual machine is created, sysprep is run - unfortunately not inside the virtual machine, but on the Hyper-V server itself. That server was then gone. First panic attack.

A colleague revived the (naturally remote) Hyper-V server and rejoined it to the domain; I log on. All configurations still there, all virtual machines still there. Not a single one of them works. Second panic attack.

Trying to edit the virtual machines: no go - the configurations are not accessible; Hyper-V thinks they are all on drive C:. Checked - oh, the drives I: and J: (where the machines used to live) are gone and have different letters now. OK, letters changed back and Hyper-V restarted. None of the machines run; they still think they are on C:. Third panic attack, as I realize that no configuration changes can be made.

Yet the configurations and the registry contain nothing about this mysterious C: - where does it come from? After a long search I found it: for each virtual machine and each snapshot, Hyper-V places NTFS symbolic links. They live under %systemdrive%\ProgramData\Microsoft\Windows\Hyper-V in the subdirectories "Virtual Machines" and "Snapshots" and point to the real target files. And in some magical way, all of them pointed to C: - apparently, at startup, NTFS had "corrected" broken symbolic links pointing to non-existent drives. Great.

So the links were recreated (first only for an unimportant server, to see whether it works). Of course it doesn't work, because Hyper-V ignores the nice new symbolic link. The permissions are wrong. Icacls can fix that - "NT VIRTUAL MACHINE\" is the syntax for the pseudo-user that has to be granted access. But it still doesn't work. Fourth panic attack.

While swearing, I found that a Frenchman had run into the same problem - Microsoft, in its great wisdom, has localized the account names. On the German version it is therefore "NT VIRTUELLER COMPUTER\". Great. Really great. It only cost me two hours of my life.
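For future reference, the repair boils down to recreating the link and re-granting the per-VM account access. A hedged sketch in Python (run as administrator; the GUID, the drive paths, and the .xml config extension are placeholders from my setup, not universal):

    import os
    import subprocess

    LINK_DIR = r"C:\ProgramData\Microsoft\Windows\Hyper-V\Virtual Machines"
    VM_GUID = "11111111-2222-3333-4444-555555555555"  # placeholder GUID
    REAL_CONFIG = r"I:\VMs\SomeServer\Virtual Machines\%s.xml" % VM_GUID

    link_path = os.path.join(LINK_DIR, VM_GUID + ".xml")

    # Remove the link that got "corrected" to C: and recreate it.
    if os.path.lexists(link_path):
        os.remove(link_path)
    os.symlink(REAL_CONFIG, link_path)

    # Grant the per-VM pseudo-user full access to the link itself (/L).
    # Mind the localization: "NT VIRTUAL MACHINE\<GUID>" on English Windows,
    # "NT VIRTUELLER COMPUTER\<GUID>" on German Windows.
    account = "NT VIRTUELLER COMPUTER\\" + VM_GUID
    subprocess.run(["icacls", link_path, "/L", "/grant", account + ":(F)"],
                   check=True)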

Pork tenderloin in mustard sauce

This time it's quite simple and straightforward, so here's just a rough outline of the preparation. Two weeks later I modified the recipe a bit, adding some extra spices to the sauce (ginger, hot paprika, and coriander), which Juliana liked even better - a nice spicy sauce. So here's the preparation:

  • Cut the pork tenderloin (previously remove the skin and tendons) into 2-3 cm thick slices
  • Finely chop one onion and one large clove of garlic
  • Sear the tenderloin pieces quickly over high heat
  • Remove the tenderloin pieces and place them in a baking dish (or as I did, in an oven-safe pan)
  • SautĂ© the onions and garlic
  • Deglaze with white wine
  • Pour half a tub of cooking cream over it
  • Stir in 2-3 large teaspoons of mustard
  • Season with pepper and salt, add ginger, coriander, and hot paprika to taste
  • Pour the sauce over the tenderloin pieces in the baking dish
  • Bake in the oven preheated to 200 degrees for about 35 minutes

We had bread and salad with it. Delicious and not much effort.

Spaghetti Carbonara with Vegetables and Meatballs

Juliana found the recipe and it sounded good, so we decided to try it today. It all worked out quite well and tasted good too. This time, the quantity (except for the meatballs) was just right for two people. Here's what goes in:

  • Spaghetti
  • 300g ground beef
  • Ham (about 4-5 slices)
  • Zucchini (1-2)
  • Mushrooms (about 2 handfuls)
  • Parsley
  • Garlic, marjoram, sweet paprika, rosemary, pepper, salt (all mainly for the ground beef)
  • Some breadcrumbs
  • 5 eggs (1 in the ground beef and 4 in the sauce)
  • Parmesan (we had about 40-50g)
  • Cooking cream (the one with 15% fat, about half a cup)

Preparation was a bit more complicated because you had to do several things at the same time - and of course we started the pasta too late and therefore had to keep everything warm in between. But otherwise it was actually quite straightforward:

  1. Chop the garlic finely
  2. Mix the ground beef with garlic, breadcrumbs, one egg and the spices
  3. Form small meatballs and fry them in the pan
  4. While they are frying, chop the vegetables
  5. Set the finished meatballs aside in the meantime
  6. Fry the vegetables in the pan and add the ham at the end and fry with it (not too early, otherwise the ham will become too dry)
  7. In parallel, whisk together 4 egg yolks, the grated parmesan, the cooking cream and the parsley and heat - do not boil, just heat
  8. The sauce will become a bit thicker when it is finished.
  9. Maybe add some salt to the sauce.
  10. Don't forget to start the pasta in between

When everything is ready, simply put the pasta on the plate, distribute the sauce over it, then the vegetables with the ham on top and serve. Maybe sprinkle some more parmesan on top (we didn't have any left) and add the meatballs (or just eat them separately - the sauce is quite heavy, the meatballs were almost too much of a good thing).

Goulash in the Hungarian Style

Juliana had bought pork goulash but had no idea what to do with it - so I suggested Hungarian goulash. After that, of course, I had to go into the kitchen and cook. Goulash is quite simple; blogged here just as a reminder to myself:

  • 300g pork goulash
  • 1 onion
  • 2-3 cloves of garlic
  • 1 bell pepper
  • 1-2 tomatoes
  • 100ml red wine
  • 400ml water (a colleague suggested using broth instead; it gives more flavor)
  • Paprika powder (sweet and hot rose paprika)
  • 3 cloves, 3 bay leaves, 4 allspice berries
  • Peppercorns, 1 tsp salt
  • Coriander, rosemary, thyme, marjoram, basil (or whatever spices you can think of, but these are relatively classic for it)
  • some tomato paste (mainly to balance the currently rather flavorless tomatoes)
  • 1/3 cup cream

The preparation is as follows:

  1. dice the bell pepper, tomatoes, onion, and garlic cloves (put on separate plates)
  2. sauté the onion and garlic cloves in a pot until translucent
  3. put the meat in the pot and sear it well
  4. add paprika powder (season generously!) to the meat and onions, stir quickly
  5. deglaze with red wine
  6. fill up with water
  7. put the diced bell pepper into the pot
  8. put in all the spices (the cloves, bay leaves, allspice berries, and peppercorns in particular must simmer with the meat and broth the whole time, otherwise they do nothing)
  9. let everything simmer together for 45 minutes
  10. put in the tomatoes, let it simmer for 20 minutes
  11. add some tomato paste, simmer for another 10 minutes
  12. carefully stir in the cream just before the end

Goes well with pasta or maybe rice (but I prefer pasta). Or just with bread and some vegetables.

Piroshki like Grandma's

The first time I made them with my mother-in-law - I made the dough, she seasoned the ground meat and shaped the piroshki. This time I had to do everything myself. The structure is actually quite simple.

You start with a classic yeast dough. For this, I put the following together in a mixing bowl and knead it thoroughly:

  • 500g flour
  • 200ml milk (room temperature)
  • 2 packets of dry yeast
  • 125g margarine (warm)
  • 80g sugar
  • 1 egg
  • 1/2 teaspoon salt

Knead everything thoroughly and let it rise in a warm place for 30 minutes - it should clearly increase in volume. I usually use the oven preheated to 50 degrees with a damp towel over the dough bowl, which provides exactly the right humid-warm climate. Tip: first mix the dry yeast into the flour, then add the other ingredients. This prevents yeast lumps and results in a more even dough. Instead of two packets of dry yeast, you can also use a cube of fresh yeast.

The filling isn't complicated either: simply mix 500g of minced meat with two finely chopped onions and season with pepper, salt, and other spices to taste. That's it for the filling.

Then knead the risen dough again by hand and form small balls, about the size of table-tennis balls. Roll them out with a rolling pin to slightly larger than coaster size. Then place the minced meat on the dough rounds and fold them into crescents. Seal the edges well. That was the complicated part: finding the right size for the dough balls, the right thickness when rolling out, and the right amount of minced-meat filling. It was still a bit rough this time; we just need more practice. Since we had some dough left when the minced meat ran out, we filled the rest with jam. Tip: choose a jam with a strong flavor. Oh, and really do press the edges together well - even among the sweet piroshki, a few opened up during baking.

Place all piroshki on a baking tray with parchment paper, brush with a beaten egg and bake in the oven at 180 degrees. With the quantity above, it will be two trays: bake the two trays for 20 minutes, then swap the positions of the two trays and bake for another 15 minutes. With only one baking tray, 30 minutes should be enough - just check, when the dough is nicely brown, it's done. This makes a large bowl full.

We got married

Instead of including the pictures here, there is only a reference to an album of pictures taken by my mother-in-law - the official photos are still pending (and I didn't take any myself).


Game Goulash Pot

I came across game goulash at the market and grabbed it. Then I also bought fennel and green onions. A bit of googling gave me an idea for a recipe - as usual, slightly adjusted. What goes in:

  • 300g game goulash
  • 3 medium potatoes
  • 1 large carrot
  • 1/2 fennel bulb
  • some green onions (I took two stalks)
  • 250 ml game stock
  • 125 ml red wine
  • rosemary, thyme, peppercorns, 2 cloves, 1 bay leaf

Preparation is surprisingly simple, though a bit time-consuming, so not exactly cooking when you're already hungry:

  1. Brown the game goulash in a pan with olive oil (use a high-sided one - a lot has to fit in!) until it takes on color all around
  2. Deglaze with red wine and top up with game stock
  3. Add the spices
  4. Let it simmer for 60 minutes
  5. Meanwhile, chop the potatoes, carrot, fennel, green onions
  6. Add the vegetables to the pan once the meat has cooked tender - if the liquid level is too low, simply top up with some game stock (or neutral vegetable broth if the flavor gets too intense)
  7. Let everything cook for another 25 minutes
  8. Season with pepper and salt, possibly add some nutmeg and cumin (we love cumin!)
  9. Sprinkle with chopped parsley and serve

I prepared everything ahead of time, so we can simply reheat it later. To go with it - I do have a Russian at home - bread.

Phone rates can really ruin the fun

Looking around at phone tariffs can indeed trigger minor to major fits of rage. The telephony components are somewhat OK by now; the SMS rates are still ridiculous to absurd - there is no more expensive way to send your data than in 160-character packages. But OK, that's nothing new. It's when you look at the data rates that you really start screaming.

The reason I subjected myself to this madness: an SMS from T-Mobile telling me I had used up my full-speed volume and would have to live with 64kbit downstream and 16kbit upstream for the rest of the month. I checked my iPhone's data usage: 1.1 GB downstream and 430 MB upstream - but over a period of almost a year. How I was suddenly supposed to have used 200 MB in the first 8 days of this month was a mystery to me, until I remembered that the providers conveniently bill every started 100 KB block in full. So that the full-speed volume is used up as quickly as possible. Thanks, push notifications.
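A quick back-of-the-envelope calculation shows how that adds up; the block size, payload, and notification count here are illustrative assumptions, not the provider's published parameters:

    import math

    BLOCK_KB = 100        # assumed billing granularity: every started block counts in full
    PAYLOAD_KB = 1        # assumed actual payload of one push notification
    EVENTS = 2000         # assumed number of tiny transfers in 8 days

    actual_mb = EVENTS * PAYLOAD_KB / 1024.0
    billed_mb = EVENTS * math.ceil(PAYLOAD_KB / float(BLOCK_KB)) * BLOCK_KB / 1024.0

    print("actually transferred: %.1f MB" % actual_mb)  # ~2 MB
    print("billed volume:        %.1f MB" % billed_mb)  # ~195 MB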

Looking at the big three (T-Mobile, Vodafone, and O2), you first see nice overviews with prices. And of course everything is flat. Everything is flat today. But the only thing truly flat was probably the mind of the marketing guy who came up with this nonsense. The volume is technically unlimited, but only the fine print tells you at what volume you get throttled to a ridiculous 64kbit - and that's just the download; the upload is cut to 16kbit, which is nearly unusable.

To O2's credit: if you click down the right paths, you do get a relatively clear view of the throttling tiers - not under the mobile tariffs for smartphones, though, but via "internet" and then "surfing with a mobile phone". Why one page is clear and the other is not, only the web designers know. Or the price hiders. Possibly the other providers also have such a page tucked away somewhere, but at some point I stopped feeling like looking.

Incidentally, only Telekom references the fine print with footnote numbers in the tariff - and displays it already unfolded at the bottom of the page. With O2 and Vodafone you first have to guess that something might be hidden under "further legal notices" or "further notes", without being pointed to it. Why bother, it's insignificant, it's all flat. Oh, and of course it's set in pale-gray 10-point type - it shouldn't be too easy to read. To me, this borders on fraud.

Apart from the hidden placement: the normally affordable tariffs (sorry, but tariffs over 50 euros a month are an impertinence, not an offer) come with a ridiculous 300 MB of volume before the throttle kicks in. Oh, sorry - Vodafone only gives you 200 MB ...

Then there are the funny ideas about contract lock-in. Yes, I can understand the 2-year commitment if you take a contract with a device - the device has to be financed somehow, and I don't expect gifts. But the casual extension by another year if you don't cancel at least 3 months before the end of the mandatory period - that is really cheeky.

Especially when you look at the big providers' budget brands: Base, Fonic, Congstar. Strangely enough, there you can see the throttling variants right on the tariff overview. There are even several variants to choose from. And the prices are significantly clearer. Odd - they run on their parent companies' networks. I don't have to mention that the budget brands also have more moderate contract terms, do I? Of course the budget brands aren't good either - they don't even claim to offer service anymore (which the big ones don't really deliver either; what they offer is more acts of desperation than service).

It's odd that the same service can be offered at drastically different prices and the budget brand still turns a profit. Could that have something to do with the parent brand simply selling at fantasy prices? Oh, and it is of course pure coincidence that they all charge almost the same prices in their respective segments. I mean, this is a well-regulated market; surely there are no agreements or anything like that. How could one even think such a thing ...

The enemy of mobile internet, the brake on this sector's development? The absurd ideas of the mobile phone providers. It's time for alternative radio technologies offered by providers outside this inbred bunch of purse snatchers. But hoping for that is probably absurd too - the telecommunications lobby will make sure the market isn't accidentally opened up.

PS: yes, I know that Base is not Vodafone's budget brand but E+'s. Or uses the E+ network. Does Vodafone even have something like a budget brand?

Social Networks and my use of them

At some point you realize that you simply have too many places on the web where things happen and where you are somehow represented. So here is a list of the networks I currently appear in and how I actually use them (or plan to use them - not everything I intend to do always works out).

  • My digital home is here in the blog. All sorts of things land here: links, images, articles, etc. If something is important enough for me to want to say something about it, it goes here - because only here do I have full control over my content. Some of my content is also posted to Metaowl, our small privacy aggregator.
  • I also have a small Tumblelog on Tumblr. I actually only use this to post photos or occasionally comments, etc., when I'm on the go. So it is primarily fed by my iPhone.
  • On Twitter is what Dave Winer calls the "firehose". If you want to know what I'm up to in general, wherever, you can follow my stream there. Be careful, it really is a lot - every new service that can feed my activities in there in any way gets hooked up.
  • On Flickr I have a number of photos, and I will certainly also post pictures there from time to time. Simply because sometimes I also want networking for the pictures - usually the same pictures then also end up here in the blog.
  • As a social network, I primarily use Google+ at the moment. Of course it still has some problems, but its architecture appeals to me much more than that of the other networks. And the Picasa integration naturally means I post pictures there too.
  • As social code networks, I currently use Bitbucket and Github. My open source projects land there (although I of course also keep local copies in case these hosts disappear one day). However, I currently use Github mainly to follow projects; my own projects use Mercurial and therefore end up on Bitbucket.
  • Facebook I effectively don't use at all. I push some content there (blog posts and, for example, Google+ updates), but otherwise I'm only there as a relay and for people who don't want to use my RSS feed but follow my content there because that's where they already are.

So, everyone should know where I hang out and where I appear. And now please no complaining "your Twitter stream is too full" - that's intentional. And also no "check your Facebook" - that only happens sporadically.

EVIL is King

No, this is not about worshipping evil, but simply my thoughts on camera technology, which has seen the greatest changes recently and produced the most new systems - EVIL cameras, i.e. cameras with an Electronic Viewfinder and Interchangeable Lenses. Whether it's the Micro-4/3 system from Panasonic and Olympus, the NEX system from Sony, or one of the other new systems in this category. I am constantly amazed at how much this has changed photography for me - simply through the availability of genuinely compact cameras that I actually carry with me.

Until a few years ago I had a Canon EOS 10D - a really great device with good image quality and the performance of the usual digital SLRs. OK, towards the end it was no longer current; several newer generations had appeared (when I sold it, the 50D had just come out), but that didn't matter to me - the image quality left nothing to be desired.

But the size! The camera itself wasn't exactly small (though not as big a monster as the usual professional bodies), but the lenses were really heavy. I only had a few - mainly the 35/1.4L and the 100/2.8 Macro (my autocorrect just wanted to turn that into Macho - considering the weight, perhaps not entirely wrong). Take the camera with both lenses? Uh, no thanks. And when I did have it with me, it was just a camera with quite classic functions. Video, because you spot something funny you want to capture as a little clip? Nope.

OK, many DSLRs today also have live view and video, but they haven't gotten smaller - by design they remain quite large devices. The small EVIL systems, on the other hand, save a lot of space just by omitting the mirror box. And that applies not only to the body: lenses can often be built more compactly for the same image circle when the distance to the sensor plane is smaller. And then there's adaptability! On my NEX-3 I can adapt almost every old lens I still have lying around at home. Very pleasant, because even the lenses of my old Contax RTS III were considerably more compact than current DSLR lenses - and if I want to go really compact, I can mount the Leica lenses.

What has changed for me as a result: I have the "throw the camera in the backpack" reflex much more often than before - the NEX-3 wasn't expensive, so you don't think twice about taking it along; into the backpack and done. With the standard zoom it doesn't add much weight, and you have almost everything you need. The image quality keeps surprising me - positively. I can hardly believe my 10D was really better (on the contrary: the usable quality up to ISO 3200 on the NEX makes shots possible that I wouldn't even have considered before). And focus peaking on the NEX works better with manual lenses than the focusing aids of old manual SLRs (at least with my eyesight).

All in all, the investment in an EVIL system (OK, in my case actually two systems, since I also have the Panasonic equipment) was a real gain. Even if most of my photography friends would probably turn up their noses at the family photos and plain snapshots I take - hey, I'm getting married soon, I have to get used to family pictures; the time of wild art experiments (which weren't really better) is over.

In the end, only one thing matters to me: that photography is fun again. And with the new compact cameras, it is. What more could you want?

MonoMac and Xcode 4

Ouch, that really got me - I wanted to play with the GUI tools, but MonoMac doesn't fully support Xcode 4 yet and has only this to say:

Developers that use Interface Builder are recommended to install Xcode 3.2.6 for the time being.

Source: MonoTouch News

The bad part: that's from March. That's far too long without support, which makes MonoMac only half a tool (if that). Running Xcode 3 and Xcode 4 side by side is not an option either (both want to live in the same folder). De facto you're either stuck on Xcode 3, or MonoMac is out. Or you build your GUI with GTK# - then Xcode doesn't matter (the compiler part of Xcode 4 is supported), but then you can forget the App Store (and GTK# doesn't exactly look great). Damn.

Data Protection and Social Network Buttons

Just read: Data Protection & Facebook Like Button for Website Operators. I had just been playing around with the various social buttons (they may still show up on cached pages for a while), but then figured they probably report even more connections between accounts and page visits - and turned them off for now.

The linked article takes a close look at the Facebook Like button; the Google +1 button should be very similar. The Twitter button is probably not unproblematic either, at least when the server is queried for the tweet count - but an unproblematic variant of the Twitter button should be feasible, since that is the simplest case of the three.

Well, for now I have switched the toys off again; I still need to think about it. On the one hand it is of course attractive to let visitors with active social networks share easily within their networks - but at what price?

Thoughts on Data Privacy, Loss of Control, and Other Things

On the Spackeria, tin-foil hats, and loss of control - The wonderful world of Isotopp. A worthwhile look at data protection, data traces, the inevitable accumulation of data, and the inevitability of that data being accessible and analyzed.

I myself am forever caught between the tin-foil hats and the Spackeria: on the one hand I want data minimization and have my problems with the data-collection mania in some places; on the other hand I am close to technology, enthusiastic about it, and therefore automatically end up in many data pots. And I am fully aware of how much someone could find out about me online by putting it all together.

Like Isotopp in his article, I see a massive (and probably inevitable) failure of legal data protection - but I do draw a certain distinction between data that arises in the voluntary context of using technical services (even if the user may not directly notice that data) and data collected in the state context.

The state builds data silos only under a negative premise: a state-created database is always designed from the standpoint of general suspicion. The state does not record the data of people involved in visa procedures in order to provide them with targeted information and services around those procedures - the sole purpose is law enforcement. But that automatically puts everyone involved in a visa procedure under suspicion of terrorism and other crimes - otherwise there would be no need to record their data. The state's executive despises its unruly citizens and deeply distrusts them; therefore they must be monitored.

A private-sector pile of data has a much more banal goal: commercial exploitation. As crazy as it may sound to some, I much prefer that. Google will not use the data to cast my political views in a negative light and put me on a no-fly list because I criticize the state - they just want to show me better-targeted advertising. In a way, you can rely on the narrowed field of vision of capitalists; it is skewed far more positively than that of politicians. To them, data sets are potential business - not potential attackers.

For this reason I find the current activity of various state data protection officers in the private sector (warning letters to blogs for using Google Analytics) laughable to embarrassing, as long as those same officers do not stand up against projects like the ones brewing in the Interior Ministry and other state authorities.

How can a data protection officer expect to be taken seriously when he loudly complains about the location data dribbling out of the iPhone and rails against Apple, yet makes no attempt to stop this crazy EU commissioner who has plans for EU-wide, suspicionless data retention that would violate our Basic Law (just as the data retention scheme hatched in Berlin already failed in Karlsruhe)?

iPhone Location Data Again

Once more on Apple's response to the motion-profile allegations: why Apple is right, and why there is still a problem (but a significantly smaller one than the dramatized version in the press).

Apple builds a database of position data from iPhones with GPS enabled - collected anonymously; so far there is no indication that it is not anonymous - in which the positions of networks are stored. "Networks" here means GSM and 3G radio masts and WLANs that the iPhone sees at the time. However, this raw data is not what is stored in the database everyone is talking about - it is only the basis from which the database entries are derived.

The data sent to Apple is averaged internally, and a "center" is determined for each network reported by the various iPhones (the exact positions of WLAN routers and radio masts are not simply handed out - they have to be determined somehow). This data is stored in a large database at Apple. The position data therefore refers to the computed centers of radio identifiers; the original position data is only raw material for it.

The iPhone can then determine an approximate position from the visible radio identifiers, their position information, and a weighted average based on signal strength - but it needs internet access for this, specifically access to the database at Apple. Therefore the iPhone downloads the information about radio identifiers and caches it locally. Not the entire database, of course - that would be far too much - but a relevant excerpt chosen by algorithms. This is the database on the iPhone.

Apparently Apple downloads not only the networks the iPhone currently sees but also neighboring networks - which makes sense, since the user moves around and the data for neighboring networks will (potentially) be needed; the iPhone cannot know in advance where I am going. Presumably the iPhone says "I see networks A, B, C" and the database responds "here are networks A through M from the metropolitan area you are in". The iPhone then takes X% of A, Y% of B, and Z% of C, computes a rough position, and says "here I am". If it then moves into sight of network D, D's position is already cached and the iPhone can compute its position directly, without another download.
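The weighted average speculated about here would look roughly like this - the coordinates, the weights, and the idea of weighting purely by signal strength are illustrative assumptions on my part, not Apple's actual algorithm:

    def estimate_position(visible):
        # visible: list of (lat, lon, weight) for cached network centers;
        # weight stands in for relative signal strength.
        total = sum(w for _, _, w in visible)
        lat = sum(la * w for la, _, w in visible) / total
        lon = sum(lo * w for _, lo, w in visible) / total
        return lat, lon

    # Networks A, B, C with cached center positions and signal weights.
    networks = [
        (52.5200, 13.4050, 0.5),  # A: strong signal, highest weight
        (52.5210, 13.4100, 0.3),  # B
        (52.5190, 13.4020, 0.2),  # C: weak signal, lowest weight
    ]
    print(estimate_position(networks))  # rough position between the centers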

In addition, the iPhone appears to keep a temporal history of these downloads - presumably the developer assumed that if the user has been somewhere before, chances are good he will go there again. For this purpose the iPhone keeps the data around for a year. Apple's claim that the storage duration is a bug is surely an embellishment - presumably a developer simply picked a duration without thinking hard about what would be sensible. In his understanding this wasn't special data, after all - just a technical cache for downloads that happen anyway whenever the user asks for his position.

What does this mean for the user? The coordinates in the data do not record where he was - they record where the radio identifiers are near which he approximately was. And since neighboring networks are included as well, this really is very approximate. Of course, a rough spatial profile of the user can still be derived from it - in my data, for example, I can indeed see that I have been to Amsterdam, Frankfurt, and Berlin.

Conversely, it also means that only the approximate regions where you had network reception - and thus the chance to download - show up. I was in Copenhagen, where I had network access via the hotel, so traces of it are present. In Malmö and over New Year in Russia I had no internet access - GSM yes, data no - so the iPhone could not use these location services and could not download radio identifiers with positions. Those places are therefore completely absent from my iPhone: no traces of Malmö, Yekaterinburg, or Nizhny Tagil (the same should hold if you enable airplane mode or simply switch off WLAN and mobile data).

Furthermore, the areas should get larger as you move into more rural regions: few WLANs, so mostly GSM cells, which have a larger range and are more spread out. If you store a cell together with its neighbors, that already covers a fairly large area. In big cities, on the other hand, the covered area should be much smaller, simply because WLANs have far smaller ranges and there are more of them, and the radio cells there are usually smaller too (a cell can serve only a finite number of users, and user density in cities is higher).

For programmers this is particularly interesting: do you think about what can be derived from cached data when you program? Consider, as a thought experiment, someone gaining access to your DNS cache - which every system keeps internally simply to reduce DNS queries. What picture of you could this technically harmless information paint? These are the small pitfalls programmers like to stumble over. It starts out harmless - auxiliary data fetched from the network. Throw it away after use? Well, if it will be needed again, it makes sense to keep the most frequent entries ready, right? And that is exactly how you end up with problems like the one Apple has now.

The discussion about why your browser cache contains porn images (because, say, you read your mail in Outlook with image display enabled and opened a spam mail - not an outlandish scenario!) could get quite interesting if your wife finds them there. The data no longer shows why it ended up where it did.

As stated in the title: I am referring here to Apple's answer and have only checked it against my own data. My data matches the information in Apple's statement, and the statement itself is consistent - both the contents and the described use line up quite well. I therefore see no reason to distrust it.

Apple's answer that the iPhone does not record a motion profile of the user is therefore correct - it merely stores information for position determination as an alternative to GPS. At the same time, however, it is at least a profile of presence in large areas. Criticism is therefore quite appropriate - but in my opinion it should be smarter than "Apple stores the user's positions over the last year", because that is simply wrong.

But as Apple says in the introduction to its answer: these are technical relationships that are more complicated than a simple "does Apple store a motion profile, yes or no". And our press has massive problems with questions whose answers take more than two sentences. "Apple stores data from which presence in large areas can be derived" just doesn't sound as catchy as a headline.

Unfortunately, this very imprecise reporting can itself create problems: if I know the data only covers regions I have been in, not precise locations, then explaining why my Frankfurt data also includes the red-light district (it's simply near the train station) is much easier than if I have to assume these are all exact places I visited.

Apple must (and, by its own statement, will) improve this - caching data for a year is nonsense. Backing the data up is also nonsense; it can simply be downloaded again if it is missing. Likewise, the data should not be stored at all when all location services are globally disabled. A switch like "pseudo-GPS on/off", disabling this kind of position determination entirely, might also be generally interesting - the user would then simply have to wait for a GPS fix. Just as, in my opinion, the anonymous data collection for WLANs and radio masts should be switchable.

In my opinion, no cache should exist without a control function for it (just as you can empty your browser cache). One thing must be clear: because a cache necessarily links access time and loaded data (that is the only way temporary storage can work), every kind of cache yields a kind of user profile. And the user should be able to control that at least rudimentarily (in the sense of deleting it). Building caches with a clear purpose and a UI for them from the start should become just as much a best practice as storing passwords encrypted on servers (hello, Sony!).
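As a sketch of what such a controllable cache could look like - a toy TTL cache, purely illustrative and not how iOS implements its caches:

    import time

    class ControllableCache:
        def __init__(self, ttl_seconds):
            self.ttl = ttl_seconds
            self._store = {}  # key -> (value, stored_at); the timestamps are the profile!

        def put(self, key, value):
            self._store[key] = (value, time.time())

        def get(self, key):
            entry = self._store.get(key)
            if entry is None or time.time() - entry[1] > self.ttl:
                self._store.pop(key, None)  # expired or missing
                return None
            return entry[0]

        def clear(self):
            # The control function every cache should expose to the user.
            self._store.clear()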

Pirogi (vegetarian, and not really Russian)

Oh, that was quite a complicated task. Not because of the ingredients - those were simple. Preparing the filling wasn't difficult either. But shaping and folding the pirogi, and trying to cram as much vegetable as possible into the little beasts, took some manual skill. Well, sometimes you have to tinker in the kitchen. By the way, these are not yeast-dough pirogi - the dough is more like a shortcrust. I thought it would go better with the filling, which was modeled more on a vegetable tart.

What goes in:

  • 225g flour
  • 145g margarine
  • 1/2 tsp salt
  • 2 eggs
  • 3 small peppers
  • a handful of black olives
  • 1 large mild onion
  • a small bunch of spring onions
  • a few cloves of garlic (to taste)
  • basil and oregano, pepper, salt
  • another egg for the assembly

The preparation itself (the individual steps):

  1. Knead flour, eggs, salt and margarine into a dough
  2. Let the dough disappear in the refrigerator for an hour
  3. Chop onions and garlic finely
  4. Cut peppers into small pieces
  5. Cut olives into small pieces
  6. Preheat oven to 200C
  7. Fry onions and garlic in the pan
  8. Add peppers to the pan, simmer for 5 minutes
  9. Add olives to the pan, season, simmer together for a few more minutes
  10. Cover the pan and push it to the back (it will cook a bit more)
  11. Separate the third egg into egg white and yolk
  12. Take the dough out of the refrigerator and roll it out
  13. Cut round pieces with a large cup (mine were about 10cm in diameter)
  14. Brush the edges with egg white (as glue)
  15. Put the vegetables in the middle of the dough circle, fold it in half, and press the edges together
  16. Do this until the dough and/or vegetables are all gone
  17. Then brush the tops of the folded pirogi with egg yolk
  18. And bake the whole thing for 20 minutes at 200C

The folding was quite complicated. The dough has to be quite thin, otherwise you end up with more dough than filling, which is rather dull - but thin dough is delicate, which makes the folding trickier. And the pirogi should stay closed: mine opened up, as I hadn't sealed the edges well enough (I read the tip of pressing them with a fork afterwards - unfortunately only when mine were already in the oven).

If possible, prepare the vegetables early and let them cool while the dough rests in the refrigerator - cooler filling makes wrapping easier. And the filling can of course be varied freely: one with minced meat is great, cabbage is classic, and other vegetables work too - everything just has to be cut small enough that you can still fold.

Quiche Ratatouille

Well, I had a decent amount of ratatouille left over from yesterday - and I didn't want pasta again (and there was no rice left). So I looked up what goes into a quiche and realized I had pretty much everything. I picked up the few missing things at the train station and got started. This went into the quiche:

  • 250g flour
  • 110g margarine
  • 1 tsp salt
  • 50ml water
  • 1+3 eggs (one for the dough and 3 for the topping)
  • 125ml milk
  • 200g sour cream
  • spices
  • ratatouille (I still had some from yesterday, but honestly - it works with any vegetable stir-fry)

The preparation is also quite simple:

  1. Mix 1 egg, flour, water, salt, and margarine and knead into a dough (a hand mixer is really practical here)
  2. Wrap the dough as a lump in foil and refrigerate for 30 minutes
  3. Preheat oven to 200 degrees
  4. Mix 3 eggs, milk, sour cream and season (taste what and how much spices should go in)
  5. When the dough comes out of the fridge, put it in a springform pan and distribute it (press with your hand) so that it covers the bottom and the edge - about halfway up the edge
  6. Poke holes in the dough at the bottom with a fork and pre-bake for about 15 minutes
  7. Take the crust out, add the ratatouille, and distribute it
  8. Pour the egg mixture over it so that the vegetables are well covered and the quiche is filled to the edge. Don't overfill it, though, otherwise there might be a mess in the oven (not every springform is watertight).
  9. Bake for 30 - 40 minutes (it was 35 minutes for me)

The result was surprisingly good. One mistake I made: the dough was a bit thick in the corners, and that doesn't taste great, simply because the dough itself is quite bland. But that can be fixed with a bit of practice and finesse. And I could season the egg mixture a bit more; it was a little light (okay, I'm a light seasoner, so it was fine for me, but Juliana would certainly complain). In any case, a brilliant second use for a vegetable stir-fry.

However, I still have a problem: from yesterday I only had one portion of ratatouille left - and now I have 3/4 of a quiche left ... (I think I know what I'll be eating in between this week)

Crumble Cake

Today, I didn't cook again because I was in Hamburg yesterday, but I did bake a cake. I like crumble cake, among other things because it is so easy to make. Here's what goes in (for a 26cm springform):

  • 300g flour
  • 200g sugar
  • 150g butter
  • 1 egg
  • 1 pack vanilla sugar
  • 1/2 pack baking powder
  • some cinnamon
  • 250g apricots (a small can)
  • some cranberry jam (optional)

The preparation is then quite simple:

  1. Mix flour, sugar, vanilla sugar, baking powder, cinnamon - all together.
  2. Drain the fruit (the juice with tonic water and ice cubes makes a delicious non-alcoholic cocktail)
  3. Preheat oven to 180C
  4. Melt the butter
  5. Beat the egg
  6. Mix the butter and egg into the flour-sugar mixture
  7. Knead with a mixer and dough hooks until a crumbly dough forms (this also works with a fork and by hand, but a mixer is more comfortable)
  8. Distribute 2/3 of the dough in the greased springform and press it onto the bottom, possibly forming a small rim
  9. Poke the base a few times with a fork
  10. Spread the cranberry jam on the dough base
  11. Distribute the apricots on top
  12. Distribute the remaining crumble dough evenly over them
  13. Bake the whole thing in the oven at 180C for 45 minutes
  14. Take it out and let it cool (don't be so greedy, even if it smells good)

You can put cream on it, eat it as is, or sprinkle powdered sugar on top (though it's already a pretty sweet cake, so don't overdo it). Tip: don't leave the edge area too thick, otherwise it becomes a rather hard part that is difficult to cut later. And of course you can vary the fruit almost freely: apple purée instead of fruit works great, as do cherries or plum purée (though that will probably set quite firm). The nice thing about it is that you can have almost everything in stock, and the eggs you can maybe get from the neighbor (I just didn't expect to want to bake when I did my quick Saturday shopping).

Stuffed Peppers in a Tomato Bed

Sounds good, doesn't it? Tastes good too. This time, it wasn't terribly complicated to prepare the meal, although you do have to juggle several pots and pans (okay, one pot, one pan). In my case, the pot was a casserole because the peppers were really huge. Here's what I put in:

  • 2 large peppers
  • 250g mixed minced meat
  • 100g feta cheese
  • a handful of olives (black, natural, pitted)
  • 6 tomatoes
  • tomato paste
  • half a bunch of basil
  • 1 onion
  • 1 1/2 cloves of garlic

The preparation is quite simple; you should have a large pot or casserole ready, in which you can comfortably place the peppers. Otherwise, it went like this:

  1. Chop onions, chop garlic, put both on a plate
  2. Chop olives, put on a plate
  3. Chop tomatoes, put on a plate
  4. Chop basil, put on a plate
  5. Turn the oven to 200° so it can preheat
  6. Heat the pan with olive oil
  7. Sauté onions and garlic in the pan until translucent
  8. Add the minced meat, fry until crumbly (just like for Bolognese - simply use the pan spatula to break up the clumps)
  9. Once the minced meat is nicely browned and mixed with the onions and garlic, add the olives
  10. Reduce heat and add some of the tomatoes (amount about 1-2 tomatoes), add some tomato paste, mix, cover, let it cook for 5 minutes, season with spices (in my case oregano, thyme, pepper, salt)
  11. Remove the lid, increase heat slightly, add the feta cheese and mix well, let it melt
  12. Once the feta cheese is well mixed with the minced meat, turn off the heat
  13. Cut the tops off two peppers, remove the core, remove the "ribs"
  14. Fill the peppers with the minced meat mixture, cover with the tops (previously also remove the core and stem there, basically only a ring remains)
  15. Distribute the remaining tomatoes and basil in the large pot, place the peppers in it
  16. Tip: if they don't want to stand upright, make a tripod for each pepper with roulade needles at the bottom, then it holds
  17. Put the whole thing in the preheated oven and let it simmer for 20 minutes
  18. After 20 minutes, take it out and serve on plates; purée the tomatoes and basil with a hand blender

For me, as usual, with bread - though pasta or rice also go wonderfully with it. This doesn't produce a pudding-soft pepper; it should still have some bite. And the filling is not the usual concrete block, but rather comparable to a thick Bolognese. For me there was also a beer with it, simply because with all the sun today the weather outside was shouting "beer".

Threads are great, but not every problem is a nail

If you want to have a good laugh: Node JS and Server side Java Script. Here, someone from the Java camp complains that Node.JS really isn't to be taken seriously, and then delivers the best example of why something like Node.JS (and many other alternatives for server programming) exists - because the Java code gets longer and longer with each step. And even after several iterations of an example that is quite simple to implement in Node.JS (or, say, with gevent in Python), the first comments already point out a few errors and gaps in the Java code.

Don't get me wrong - Java has a lot of good solutions for multithreaded programming in its standard library. Probably the largest selection of concurrency tools of any currently available language. But as so often in life: threads are not the answer to every parallelization question. Especially under high request load, the assessment in the comments that 20K threads is already very high is ridiculous - tell that to the programmers of Eve Online, where every ship in the virtual universe is modeled as a microthread.
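
Just to illustrate the scale, here is a minimal sketch with gevent - not how EVE actually does it (they build on Stackless Python), and the numbers and names are made up. The point is only that tens of thousands of cooperative microthreads are cheap, because they are not OS threads:

    # Illustration only: 20,000 cooperative microthreads (greenlets) are cheap.
    # This uses gevent; EVE itself is built on Stackless Python.
    import gevent

    def ship(ship_id):
        gevent.sleep(0.1)   # yields cooperatively instead of blocking an OS thread
        return ship_id

    ships = [gevent.spawn(ship, i) for i in range(20000)]
    gevent.joinall(ships)
    print("simulated", len(ships), "ships")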

Java is an interesting platform precisely because it comes with many low-level libraries that let you do very interesting things - and that are helpful for building reasonable high-level constructs on top. In combination with languages like Clojure or Scala, for example, the thread monster loses some of its terror. But sometimes the answer is not a thread but asynchronous IO (for both disk and network access) and the intensive use of coroutines or continuations.
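
To make that concrete, a minimal sketch of the async-IO-plus-coroutines model, here in Python's asyncio (just a stand-in for illustration - the same model is what Node.JS and gevent offer): a single-threaded echo server that handles many connections without one thread per client.

    # One event loop, many connections, no thread per client.
    import asyncio

    async def handle(reader, writer):
        while data := await reader.read(1024):   # suspends this coroutine, not a thread
            writer.write(data)
            await writer.drain()
        writer.close()

    async def main():
        server = await asyncio.start_server(handle, "127.0.0.1", 8888)
        async with server:
            await server.serve_forever()

    asyncio.run(main())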

Also, the incomprehension among Java programmers of the approach of solving the multi-core problem simply with several parallel processes and message-passing between them is quite strange in 2011 - after all, 2009 and 2010 were the revival years of Erlang (which, don't forget, has existed for much longer), and Erlang's central idea is precisely to make network- and CPU-spanning message-passing the default in order to achieve very simple parallelizability and scalability.
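
The model is easy to sketch even with Python's standard multiprocessing module - a toy version of the Erlang idea (isolated workers, messages instead of shared state), not Erlang's actual semantics:

    # Toy Erlang: isolated worker processes, no shared state, messages only.
    from multiprocessing import Process, Queue

    def worker(inbox, outbox):
        for msg in iter(inbox.get, None):   # None is the stop signal
            outbox.put(msg * msg)           # "reply" with another message

    if __name__ == "__main__":
        inbox, outbox = Queue(), Queue()
        workers = [Process(target=worker, args=(inbox, outbox)) for _ in range(4)]
        for w in workers:
            w.start()
        for n in range(10):
            inbox.put(n)
        print(sorted(outbox.get() for _ in range(10)))
        for _ in workers:
            inbox.put(None)
        for w in workers:
            w.join()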

Java programmers always remind me of the COBOL programmers of my early days, who in every language and every programming approach deliberately picked out the things that were solved differently in COBOL (and sometimes even perhaps a bit simpler) - but then fell flat on their faces when they had to solve real problems outside the COBOL comfort zone with them.

The best thing about Java is the JVM: a platform that enables the multi-paradigm and multi-language approaches with which you can pick the right tool for each problem. And even then, sometimes the answer is still Node.JS or another small, lean, asynchronous server. Because even with a large collection of assorted hammers, you should still reach for a screwdriver when you meet a screw.

Minestrone for the whole family

Well, I simply refused to use less than one whole of each vegetable, and suddenly it was quite a lot. So better have a large pot ready! But the minestrone tastes damn good. What I put in:

  • 1 bell pepper
  • 1 zucchini
  • 3 tomatoes
  • 3 celery stalks
  • 1 large carrot
  • 3 small potatoes
  • 100g streaky bacon
  • Parmesan with rind
  • 150g green beans
  • 2 purple onions
  • 2 cloves of garlic
  • 1/2 bunch basil (fresh)
  • 1 tablespoon tomato paste
  • a handful of olives (black, pitted, natural)
  • 1l vegetable broth
  • 50g butter
  • olive oil
  • approx. 50g spaghetti (broken into pieces)

The preparation is real work, and in the first phase there is not much rest. It's best to group the vegetables as indicated below, because they go into the pot one after the other - everything gets fried first. But otherwise it's actually quite simple:

  1. Dice the bacon and put it on a plate
  2. Dice the onions and garlic and put them in a small bowl
  3. Dice the potatoes, celery, and carrots and put them in a bowl
  4. Chop the beans and put them in a bowl
  5. Dice the bell pepper and zucchini and put them in a bowl
  6. Heat the pot with olive oil and the butter
  7. Fry the bacon for 2 minutes in it (stir vigorously!)
  8. Put the onions with garlic in the pot and fry for 2 minutes (stir vigorously!)
  9. Put the potatoes, celery, and carrots in the pot and fry for 2 minutes (stir vigorously!)
  10. Put the beans in the pot and fry for 2 minutes (stir vigorously!)
  11. Put the zucchini and bell pepper in the pot and fry for 2 minutes (stir vigorously! if your arm feels like it's about to fall off: that's normal, keep stirring!)
  12. Put the lid on, turn down the heat and let the vegetables simmer for 15 minutes, stirring occasionally
  13. Unfortunately the arm doesn't get a rest, because now the tomatoes get diced and, if desired, the olives cut small (I always leave them whole in soups)
  14. Before the 15 minutes are up, throw in the olives (after about half the time)
  15. Chop the basil
  16. When the 15 minutes are up, put the vegetable broth in the pot, stir
  17. Put in the tomato paste, chopped tomatoes and basil, season with pepper and salt
  18. Throw in the parmesan rind (I tie it with kitchen string so it's easy to fish out again)
  19. Bring to a boil, turn down the heat and let it simmer gently for an hour (on my monster pot half a point on the electric stove is enough)
  20. Fish out the parmesan rind and throw it away
  21. Put in the spaghetti and cook for 10 minutes

I then eat the whole thing simply with bread. You can also put sour cream in the soup or - very Italian - grate parmesan on top. With a glass of wine.

Nuclear meltdown in Japan increasingly likely

BBC News - Japan earthquake: Explosion at Fukushima nuclear plant. So much for the hope that it might still turn out alright. And people are still running around claiming that something like this could never happen to us, because here everything is so much safer. Funnily enough, I remember incidents in cooling systems that were only admitted long after they occurred - and the failure of the cooling system is the problem in Japan; the tsunami and the earthquake were just the triggers.

What I wonder, though, is: how will the catastrophe in Japan change our perception of nuclear energy? With Chernobyl, and Harrisburg before it, secrecy was relatively easy - but Japan is a country whose inhabitants are highly technologized. The joke about at least 5 cameras per Japanese person may be exaggerated, but the number should be high enough to make secrecy more or less absurd. And the deep integration with the internet provides publication channels that were unthinkable at Harrisburg and conceivable only to utopians at Chernobyl.

Surely, energy companies and the government will now close ranks and talk about how earthquakes and tsunamis are not a problem in Europe. And thereby completely miss the actual problem, because as mentioned above, cooling systems can fail for reasons other than earthquakes and tsunamis. So such a problem is quite conceivable here as well. And why should we believe our energy companies (and our government), who are regularly caught lying, any more than the Japanese energy company, which is also known for lying?

It will be difficult for politicians to lie convincingly about such things. And maybe, just maybe, people in Europe will wake up from their wishful thinking that nuclear energy is so safe.

Strange Phenomena in iPhoto

I only use it as image storage for creating books and for syncing to my iPhone and iPad, so iPhoto is rather unimportant for my photo management. But at the moment I have a phenomenon that is driving me crazy: I imported a CD of normal JPEGs, then created an album and put the pictures in it. And now iTunes keeps claiming during sync that this album is empty. The import is also listed as a separate event in iPhoto, and iTunes likewise claims in the sync panel that this event is empty (0 pictures). On the other hand, when syncing "All Pictures," all pictures are transferred - including these. Only the albums and events consisting solely of these pictures are missing. Because iTunes thinks they are empty.

What's the point of this? Does anyone out there have an idea? Googling hasn't turned up anything useful, and I've already tried various things (deleting and recreating albums, different ways of creating albums, etc.). The whole thing is quite strange. iLife is quite nice as long as it works, but when problems arise, the whole thing is nearly impossible to diagnose. Which wouldn't matter much to me if I didn't stupidly need it for syncing with my iPhone and iPad ...

One more reason to stick with Lightroom: I know where the pictures are, the databases are normal SQLite, and I can get at them if necessary. And if they're ever toast, I can reconstruct everything from the pictures and sidecar files. I'll also cross Aperture off the list; its picture management sounds too much like iPhoto's ...
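
That's not an exaggeration, by the way: a Lightroom catalog is a plain SQLite file that any SQLite client can open. A minimal sketch (the catalog path is of course made up):

    # Peek into a Lightroom catalog - it's just SQLite.
    import sqlite3

    con = sqlite3.connect("My Catalog.lrcat")   # hypothetical path
    for (name,) in con.execute("SELECT name FROM sqlite_master WHERE type='table'"):
        print(name)
    con.close()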

It's quite embarrassing when a tool from Adobe is more reliable and trustworthy than what Apple delivers. Especially since iLife is supposed to be foolproof - but when problems do arise and troubleshooting becomes necessary, it's a case of "no user-serviceable parts inside".

Apple just can't do encryption

I fell for it again and thought, I'll just enable the encryption of iPad backups. Pretty stupid. I should have been warned by the debacles with the encrypted home directory. But of course, I did it again. Everything worked fine until today when the backup mess happened - it got stuck in the first step and just wouldn't proceed. Possibly corrupt backup files on the Mac. Ok, the standard procedure is to simply delete the backup in the settings under devices and create a new one. But that doesn't work if you have encryption enabled - it complains, naturally only after all the steps have been completed, that it can't make backups because no session with the iPad can be started. Huh?

And of course, I can't reset the password - it always claims it's wrong (even before I deleted the backup). My suspicion: the password is checked against the backup and if there isn't one, or it's defective, you can't perform a successful check. Resetting the password doesn't work, creating new backups doesn't work, and making iTunes forget the iPad also doesn't work. Before someone thinks they need to tell me I don't know the password: iTunes saves the password in the keychain if requested and yes, the password is the one I enter. And yes, that is definitely the correct one - the device identifier is saved as the account name with the password. And no, this exact password is of course not accepted...

Solution according to Apple? Completely reset the iPad and set it up again. Great, fantastic idea. Sure, much of the data is on my Mac, but over time data has accumulated that isn't - and I would like to carry that over somehow.

By the way, normal backups and restores work - and with unencrypted backups, you can also create a new one if the backups are corrupted. But not if you have encryption enabled.

Frankly, this renewed experience with Apple's inability to build reasonably stable encryption solutions makes me rather skeptical about their full-disk encryption in the upcoming 10.7...

Update: after a few experiments (tested on another computer; reconstructed the iPad backup from the Time Machine backup and tried with that), I suspect the password is also recorded on the device - and this record can apparently become corrupted. Even on another computer, the definitely-correct password is rejected as wrong, and another computer likewise insists on making an encrypted backup (which makes sense - otherwise you could trivially get at the data via a backup on another machine). The problem is not that it protects itself against manipulation; the problem is that this crap can break without any external sign. The backups always completed fine - they are just suddenly worthless now (as is the data on the device).

Beef Rolls with Ratatouille

Well, that was today's cooking experiment. A partial success, because I completely miscalculated the quantities (not a problem - there's a freezer, and tasty rations for another day are quite nice too) and the ratatouille came out somewhat overcooked today. But that's what happens when you juggle several pots at once; it doesn't work perfectly the first time. The roulades, though, were great - if gigantic. Only recommended if you're really hungry! I cooked ahead (heating up a roasting pan for a single roulade would be silly); this makes about 3 servings:

  • 3 horse meat roulades (careful, these things are rather XXL-sized - one per person is enough!)
  • 300g mixed minced meat
  • Pitted black olives "natural" (about 10-15 pieces, depends on the size - and the taste)
  • 1 onion
  • 3 cloves of garlic
  • Mustard (I had a nice spicy fig mustard, but a simpler medium-spicy one should also work), about 3 tablespoons
  • 1 carrot
  • half a celeriac
  • 1 bay leaf
  • 2 cloves
  • some peppercorns
  • 100ml red wine
  • 200ml broth (probably should have been more, the sauce was a bit meager)
  • 3 bell peppers
  • 1 eggplant
  • 1 zucchini
  • 1-2 tomatoes
  • Basil, oregano, thyme, or whatever spices you like
  • Pepper and salt as usual
  • Olive oil

For the preparation, it's best to start with the roulades and their filling - because that takes the longest and the roulades simmer in the oven, so you can then prepare the ratatouille in peace.

By the way, it's advisable to have a roasting pan for this dish - one of those gigantic, very heavy oval pots you know from your mother's roasts. Expensive, heavy, annoying - but given the size of the horse roulades, don't even think about a normal pot; go straight for the roasting pan.

  1. Chop onion and garlic finely
  2. Chop olives finely
  3. While we're at it, chop the eggplant finely and soak it in salted water (eggplants are sometimes bitter, so they need a bath before use)
  4. Mix minced meat, onions, garlic, and olives thoroughly. You can also add spices (thyme, oregano) here. Basically the distribution: meat spicy, vegetables rather mild, so here the slightly stronger things.
  5. Spread the roulades out flat (and marvel at the size - the things are gigantic!)
  6. Spread each roulade with a tablespoon of mustard
  7. Then distribute the filling
  8. Roll up the roulades and secure with roulade rings or roulade needles (rings worked better for me - two per roulade, because of the size)
  9. Chop celeriac, carrot (and if desired another onion!) finely
  10. Preheat the oven to 170°
  11. Heat oil in the roasting pan, sear the roulades from all sides for a few minutes
  12. Remove the roulades and set them aside on a plate; add the celeriac and carrot pieces to the roulade drippings and fry
  13. Add cloves, bay leaf, and peppercorns
  14. Fry everything well, it should leave traces on the vegetables
  15. Deglaze with red wine (pour over)
  16. Bring everything back to a boil
  17. Pour in the broth
  18. Bring everything back to a boil
  19. Put the roulades back in the roasting pan
  20. There shouldn't be too little liquid at the bottom, because some of it will still boil away and the roulades would dry out - so top up with some water if necessary (or broth, if you have any left)
  21. Bring to a boil briefly
  22. Put the roasting pan with the roulades in the oven and let them simmer there for about 1.5 - 2 hours. In between, you should turn the roulades over (1-2 times).

Now that the roulades are simmering - and that takes time! - it's a good moment to clean up the kitchen and wash all the plates and bowls you used along the way, because you panicked that there was no free surface for the tongs, or the vegetables had to wait somewhere for their destiny, or some such. Take a short break and relax; ratatouille is actually quite simple. About 30 minutes before the roulades are done, start on the rest:

  1. Chop the bell peppers and zucchini into small pieces; keep the eggplant separate, it needs longer (ideally, keep all three vegetables separate)
  2. If desired, add an onion and some garlic - you can simply do this with the roulades above and take some for the ratatouille
  3. Chop tomatoes finely (if you want, you can pre-cook them in hot water and then peel - I don't have problems with tomato skin and save myself the trouble)
  4. Heat oil in the pan
  5. Add eggplants and pre-fry (here you would also add the onions and garlic)
  6. When the eggplants start to become translucent, add the zucchini
  7. When the zucchini starts to become translucent, add the bell peppers and tomatoes
  8. Fry everything thoroughly and season. The tomatoes should fall apart, but the other vegetables should still be recognizable in their shape.
  9. Put the lid on the pan (your pan has a lid, right? If not, this gets difficult!) and let it simmer for 5 minutes (this was my mistake: mine simmered longer and became too soft - the 5 minutes are my estimate)

When the ratatouille is ready, the roulades should be ready too. So take the roulades out of the oven and put them on the plates. If you want sauce: simply strain the vegetable-broth mixture from the roasting pan through a sieve, dilute with some water, and thicken with sauce thickener. For me that's still theoretical, as there wasn't enough left at the bottom to bother - I just spooned some of the roasting residue onto the roulades.

I ate the whole thing, as usual, with bread. And with it - since I had it open anyway - a glass of wine. It was very tasty (okay, in a restaurant I would have commented on the very soft ratatouille, but with my own experiments I'm quite tolerant of myself), just far too large a portion. And the time invested in producing the whole thing was out of all proportion to the time in which I devoured it. But hey, the weather wasn't great today, so such cooking excursions are allowed.

Pepper and Bean Soup with Ground Meat

And once again, a funny episode from "Cooking with RFC1437". Today, a real man's soup. Okay, for men who aren't too wild about spicy food. So, a little man's soup then.

  • two bell peppers
  • one onion (not too small)
  • 4 cloves of garlic
  • 2 mild peperoni
  • 300g ground meat
  • 400g kidney beans (from the can)
  • 500ml vegetable broth (for me it was only 450ml - the other 50 went into the previous cooking frenzy)
  • tomato paste
  • basil
  • cilantro
  • pepper and salt

The preparation:

  1. Chop or dice the onions (I dice, well, sort of like cubes - just small)
  2. Dice the garlic
  3. Dice the bell peppers
  4. Dice the peperoni and wonder when you'll buy a food processor
  5. Pan hot, add olive oil
  6. Fry the ground meat until crumbly (just put it in and keep breaking up the clumps with the spatula until you have nice small crumbles) - don't cook it through completely, it will cook further in the soup
  7. When the ground meat is nicely browned, put it in the pot
  8. Put the onions and garlic in the pan and let them become translucent
  9. Put the bell peppers and peperoni in the pan and fry for a few minutes together with the onions
  10. Take everything out of the pan and put it in the pot
  11. Half a liter of vegetable broth in the pot
  12. The beans in the pot
  13. Basil and cilantro in the soup - I am quite generous with basil
  14. Bring it to a boil, then simmer for 30 minutes (just not too much bubbling in the pot - on my stove, half a point is enough with a full pot), stirring occasionally
  15. Taste and add pepper and salt

The whole thing then tastes a bit like a mild, slightly sweet chili con carne. I suspect that with chili spices and hot peperoni it also makes a usable spicy variant. And the vegetables can certainly be varied freely. The whole thing yields approximately 4 normal soup bowls.

Cooking with rfc1437 - Mediterranean Pork Medallions

RFC1437 on the Road. If you've been wondering about the constant food photos in the sidebar lately - these are often things I've cooked myself. Since I can't really cook, it's quite a sensation for me when it actually tastes good!

Today: Mediterranean Pork Stir-Fry

  • Pork (cut into 2cm strips), amount depending on hunger
  • one bell pepper (it was a large one, otherwise take two)
  • 400g mushrooms
  • 50ml vegetable stock (for the sauce later)
  • 250ml cream (that was too much, next time I'll only take 200ml)
  • two cloves of garlic (I remove the heart, it smells less, tastes the same)
  • Oregano and sweet paprika (I put that on the meat before frying, it becomes more flavorful then)
  • Pepper, salt (well, that's always there)

The preparation:

  1. Cut meat into strips, dice bell pepper, chop garlic
  2. Pre-season the meat
  3. Heat pan with olive oil
  4. Add the meat and fry until it gets a nice crust (not too long, otherwise it will be too dry later - it will still simmer in the sauce); for thin strips, 2 minutes seems enough
  5. Remove meat, add mushrooms and sweat (I had to google that, it just means frying until they sweat, so water comes out)
  6. Add vegetable broth, bring to a boil
  7. Add cream, bring to a boil
  8. Taste the sauce, season with oregano, pepper and salt to taste
  9. Add meat and bell pepper, cook in the sauce for 5 minutes
  10. Optional: let the food boil over (bah, uncooperative food - it just won't wait when I'm not looking)

I just eat it with bread - that also helps if, like me, you've made too much sauce. And no, I don't think this counts as calorie-conscious eating. Presumably the sauce can also be made with something other than cream, but hey, I can't cook, I have to stick to recipes! (That's why this gets blogged: then I can re-cook the things that taste good without a long search.)

You are heroes!

Thanks, Judith Holofernes!

The BILD newspaper is not to be regarded, with a wink, as a trashy cultural artifact; it is no harmless "guilty pleasure" for well-groomed social climbers, no witty social reference, and no lifestyle quote. And least of all is the Bild newspaper what people would like to sell it as: the hated but largely harmless fixture of an otherwise much smarter Germany.

The Bild newspaper is a dangerous political instrument - not only a strongly magnifying telescope into the abyss, but a malicious creature that does not describe Germany, but makes it. With an agenda.

by Wir sind Helden. The Blöd's advertising campaign with these unspeakable "celebrity quotes" - for which people lend themselves who really should know better - is unspeakable nonsense. At least someone is saying what they actually think about it, instead of letting themselves be instrumentalized for the Blöd. (Obligatory Guttenberg note: I shamelessly copied the title from the Schockwellenreiter.)

IP Addresses and Privacy

IP Address: Data Protectionists Target AdSense, Amazon Links, and IVW. I don't know, but I think this is slowly starting to overshoot the mark. Yes, data collection should be avoided where avoidable. And certainly, one should always keep in mind what a central player like Google can do with the data. But if this leads to, for example, the Google API Loader for jQuery no longer being usable because its accesses also go to Google servers, or if, as here, complaints are made about Amazon affiliate links - which only contact Amazon when clicked, not in general - then things are getting a bit hairy.

Then we are only a short step away from generally prohibiting links to pages of larger providers. Or from absurdities such as the idea expressed here that using Google Mail in Germany is illegal. Yes, IP addresses are to some extent personally identifiable. And with IPv6 this will certainly become even more apparent (since there the reuse of IP addresses is not as necessary as with IPv4). But the IP address is at the same time the central pivot of the internet, and if one focuses on it too much, one eventually reaches the point where the chief data protection officer prohibits access to the international internet, because it reveals one's IP address to computers outside Germany ...

Data protection is to a large extent also about educating users and about users' self-responsibility - the latter, of course, can only be achieved with an appropriate level of knowledge. I would feel much better if the data protection authorities also produced useful output in the form of citizen information. But on that front, things look rather bleak.

So, discuss with the large providers and, if necessary, take them to court to force them to comply with data protection guidelines: yes. Public discussion about the problems and dangers: yes. Wildly attacking random forum operators: no.

Why is the data protection officer going after something as irrelevant as the site mentioned in the article, and not after one of the big players in the forum business, such as Heise, Spiegel, Focus, or Golem? Too much respect for the reaction to be expected there?

Something smells fishy about the whole thing. Possibly we don't have all the information - but I can't think of what missing information would make this an appropriate reaction.

MobileMe sucks hamsters through straws

Twitter / Search - mobileme. It's really shitty when you have to search Twitter to find out that MobileMe is broken again and you can't log in - because the stupid Apple status page once again provides no information. The way Apple operates MobileMe is highly unprofessional. Unfortunately, sync methods other than MobileMe are pretty lousy if you want to sync various Macs and iOS devices. And since I naturally searched Twitter too late again, and had already tried to fix my problem using the support information Apple provides, I will probably have to rebuild my MobileMe access on my Air tomorrow or so. Because of course, checking various MobileMe problems is only possible destructively. Thanks, Apple, for this waste of time.

what a superb owl

Probably from here (I only heard about it indirectly via Twitter).

Layout Experiments

Since I'm still figuring out where my new blog is going, there are always some layout experiments. The first iteration was a bit too dark for me, so everything is now a bit lighter. I also noticed that I write a lot of asides (short articles without titles), which can be a bit hard to read, so I've adjusted the font size and line height of asides to match regular posts. Overall, I think it looks a bit more relaxed now.

In response to a question: no, I still don't plan to make the links colorful again - in fact, I've even removed the color from the small date tags. For some reason, color just doesn't seem to work in my layout right now - except in my pictures, of course. But perhaps the dark links are now easier to read thanks to the larger font.

I usually test with Firefox, Chrome, and Safari - so if something doesn't work with IE, I won't notice. If something is broken, you can let me know via my contact form.

Performance hog ...

... one should not necessarily run. The Tumblr widget needs about 300ms due to its synchronous access to Tumblr, and the typography plugin needs another 300ms. Ouch. Okay, then my text just looks typographically crappy without it - I can live with that.

For the cache, I'm currently on XCache - let's see if it produces strange messages like APC did. I suspect conflicts between APC and WP Super Cache were behind my problems, because every time the problem occurred, I also had those strange GC messages from APC in the log. And no messages from PHP, which suggests that nothing was executed by PHP at all (which would also explain the empty output).

WordPress has support for XCache's object cache; great, you can activate it with a small plugin. I'll skip it, though, because the blog is faster without this module (and 27 vs. 29 DB queries is not exactly a wildly successful use of the object cache). With the object cache activated, the blog was immediately back in the 700ms+ zone - quite without any time-consuming plugins.

For Tumblr, I will now build a JavaScript/Ajax solution that uses the Tumblr API to find my images and push them into the page's HTML. It's only for looks anyway, and that way a Tumblr connection problem can't accidentally get cached.

But it's a shame about the typography plugin's quite remarkably well-functioning hyphenation. You can't have everything. Render times above half a second strike me as silly. Maybe I'll change my mind again once I've stared at my blog a while longer.

Bottle

Nice stop motion film with two piles of dirt.

Bottle from Kirsten Lepore on Vimeo.

Owl Content in its new home

The Owl Content is now also running on the new box. It wasn't particularly complicated; the hairiest part was a silly bug in the feedparser used (a terribly old version - quite possibly newer versions have long fixed this) and an incompatibility in the PHP gettext implementation that Wordpress 1.5 still uses. Otherwise, it pretty much did everything right out of the gate. Some hits may still land on the old system until the nameserver changes have propagated, but that should sort itself out over the next few days. And quite incidentally, the owl now has much stronger wings - it should be enough for the next 8000 posts.

Moving Status

It's finally happening: the static extraction of my old blog for the archive took several days, but now I have a complete backup, the nameservers have been redirected, and over the next few days everything should be served from the static part. Some links I am redirecting directly, for example the main feed and the homepage itself. Maybe I will gradually redirect other things to the new blog if it seems worth it.

The Meta-Owl is still on the old server; I'm afraid I'll have to tackle that next and see whether it can be made to fly on a new system. And then there are my subtenants' various photo galleries - at least I've already prepared everything for those; the rest is mainly copying all the files and a bit of layout adjustment.

It looks like I can shut down the old machine in the near future (maybe even by the end of the month?). The machine has served me well for many years, but it is now a bit outdated. But I will put the last backups aside somewhere, because as I know myself, I will realize shortly after shutting it down that I have stored something somewhere that I urgently need ...

Bitrot reloaded

It's time to start over and rebuild. This is an attempt to work with WordPress again. After years of not updating the stack underneath the blog rendered my own software obsolete, this time it's standard off-the-shelf software. Let's see how it goes.

Bitrot

It got me too. My old blog software probably won't survive unchanged. An old Python version (2.3), an old (very old) Django (0.91), an old PsycoPG driver (1.0), an old PostgreSQL (7.4), and all of that on an old Debian (a wild mix of various versions with backports, custom programs, and several failed upgrade attempts). Argh.

Well, I'm still torn between "rewriting" and "throwing away". The latter has the charm that I won't have to carry all that junk around anymore. And honestly, nothing particularly interesting ever happened on my blog anyway. Maybe I can set up a wget mirror beforehand and dump the whole thing somewhere statically, as an archive.

Rewriting also has a lot of charm, but converting thousands of old entries (over 4000 articles and over 4000 links, plus almost 200 images) from 8 years (first entry on November 3, 2002) doesn't sound like fun. And probably thousands of the links are outdated and obsolete anyway.

No idea what I'm going to do, maybe I'll try to bring the Metaeule to the new box first, where I only have the problem that PHP4 is no longer in the Ubuntu repository for 10.04 and I therefore have to force the owl to PHP5 (and that with code based on Wordpress 1.5 - I must really be crazy).

Or I could try to install an old Debian with the packages used back then - the box doesn't face the front, it sits behind other machines, so the hacking risk there is rather low. The Metaeule of course also has a few thousand posts in its archive (only 8291 - that's almost nothing), but if I can keep the old software running (a few security patches have accumulated over time, so it can actually keep pottering along), I don't necessarily have to tackle it.

Somehow, the internet was also such a really bad idea ...

Played around with Tabblo

Tabblo: Clemens Church


I took these pictures in the Clemens Church in Münster to show the church to Juliana. And while searching the internet, I realized that there are hardly any really nice programs for freely arranging pictures on a page - Tabblo was the most convenient. Strangely, almost everything seems focused on presenting pictures as some form of slideshow - in this case, however, I want to create a "loose panorama", i.e. arrange the pictures roughly as they correspond to the actual room. If anyone knows good software for the Mac that can create this type of website locally, I'd be grateful for tips (possibly it works somehow with iPhoto or iWeb, but maybe there's something more streamlined).

Basically, I like Tabblo's idea of making pictures freely arrangeable and shaping them, optionally with text, into small picture stories. Something like this as a locally hosted project might be quite interesting too, because the pure slideshow is often rather boring.

La Bamba

We had a great time in St. Petersburg listening to the two of them, so here's one of their songs. This was also my premiere using the GF1 to record a video - and don't complain about the shakiness, I didn't have a tripod with me and was more focused on the sound anyway. But shooting HD video with a camera - that's something else ...

The Qualitative Decline of a Provider

There was once a provider that sold root servers in Germany. And the services were good. And the support was good. And oh wonder: you got competent contacts. And hey, you got quick responses. And another thing: when you described problem situations, the contact person on the other side actually understood what it was about. And everything was good.

And then the provider grew. And grew. And somehow the brain didn't grow with it.

Fast-forward to yesterday: I received an email. According to it, one of my servers had attacked another server in the network. As evidence, a log/dump showing this attack traffic was attached. My IP and that of another machine were visible, as was the protocol: FTP data. So far, so good. If my machine has uninvited guests, I appreciate being told about it.

First act: contact via ticket and request a console - because the network connections have been cut, the only access is a console connection via the web. Oh, great: the provider has chosen software that doesn't work with proxies - direct access to the ports only works when the firewall does NAT (which is less common with corporate firewalls). And with that, it's already 6 hours of forced downtime, because there is no alternative: to access a machine with a pure text-console setup, I have to use a Java-based web console. Okay, even intelligent companies make dumb decisions from time to time, I thought ...

Second act: at home, I started the console directly. And in between, I pondered the addresses - don't I know this address? Hmm, let's see. And sure enough: it's in one of my scripts. It's called once a day and shovels a few gigabytes of data to this target machine. Why? Well, the target server is the FTP-accessible backup server for data backups ... The fact that the original email didn't mention this should have made me suspicious. Intelligent providers know their own machines. But okay, such a faux pas can happen; take it with humor, what can you do. It's just the internet.

Third act: support message sent - quickly, before 18:00, although support is listed until 22:45, but well. No reaction after an hour. Hmm. Called - "oh, yes, I saw the ticket, I'll pass it on again". Two more hours. Then the statement: "the traffic was not a backup, it was an attack". What?

The third act drags on for some time, because the support employee simply refuses to accept this clarification. What is supposed to be unusual about the traffic, he cannot or will not say. Supposedly, I hadn't logged in at all. Strange, then, that in my research the files on my machine and on the backup server carry exactly the specified timestamps - so clearly a login and a transfer took place. After all, I can only have gotten the data onto the FTP server; I can't conjure it there. Okay, the data is large - but I have 50 GB of backup space, so I should be allowed to fill it. And the volume is around 14-15 GB. Yes, that's a lot.

But no, the support employee insists that I submit a written statement. So I have to put in writing what nonsense the provider has produced. That doesn't go down well. Bad mood.

Fourth act: the next morning first a meeting, then the fax was written and sent right away. Waited, worked, occasionally checked the mailbox. Nothing. After 6 hours, no reaction. Expanded the support ticket with a comment; also no reaction. Server downtime meanwhile far beyond 24 hours - because of a misinterpretation on the provider's side. But that doesn't seem to bother them. Is harassment the standard treatment for customers who dare to object?

Too bad - I can no longer recommend Hetzner as a server provider. A pity; after so many years, I really didn't expect such a blatant failure.

(Possible cause: simply the data volume and a misguided intrusion detection system, or maybe the one file that is larger than 2 GB? Who knows - the provider has no interest in clarification, and on my side I can't analyze it, especially not with a machine disconnected from the network).

Update: the fax did not arrive (well, happens sometimes - fax is primitive Stone Age technology). Kindly, I was allowed to send a photo of the fax instead, since the next time I'll be near Stone Age technology is not until Friday. And lo and behold: a few minutes after sending the photo comes "the server is back online". What surprises me, though: my fax contained exactly what I had already said in the ticket. And of course there is no explanation, and no "sorry, our mistake" either. Nothing. Head-scratching and annoyance over 30 hours of downtime.

Apple and Windows - Fail

Well, for a few days now I have a cute Asus EEE-PC 901 at home. The thing rocks! I thought I'd stick with Windows, so I can at least use some of the familiar programs and sync my bookmarks between the Safaris, etc. And what happens? Apple is a total failure.

First of all: I don't use an admin account under Windows; I use a normal account and the admin only for installation. For this I also set up my own admin account, since I wanted to put my user profiles on drive D: due to the SSD's silly partitioning. That all works quite well. But Apple turns a deaf ear to such setups.

For one thing, Apple still insists on messing around on drive C:. I can't tell all the installers to install to D:; some just install to the system partition. Or iTunes: it wants the installer on disk in the admin's home directory (which wasn't readable for others before), in order to then install something locally (whatever that is) for the user. If that worked, fine. But it doesn't - iTunes runs through the entire first-run sequence, searching for MP3s and all that crap, every time it starts.

And setting my iTunes media library to another drive doesn't work either - my 16GB Class 6 SDHC card will probably remain off-limits to iTunes, because the library location keeps getting reset to D:. Exceptionally stupid.

And MobileMe? Well, it worked once and mangled my bookmarks, but since then it doesn't work at all - the control panel accepts my credentials and logs me in, but as soon as I reload the panel or go to the sync settings, my credentials are supposedly incorrect. Exceptionally stupid.

All in all, Apple's offering for Windows makes a rather pathetic impression - sure, some of it may be due to the non-admin user or the profiles on drive D:, but hey, similar setups exist on the Mac, and Apple handles them there. So why does this fall apart on Windows?

Safari works well, but against Chrome it doesn't stand a chance on the small Asus box - too slow. Though Google didn't exactly cover itself in glory either with the Chrome installation, which is forced into the user profile (unchangeable without hacks).

Do they want to force me to put Linux on the box? Then there's not even the temptation to look at Apple's stuff...

Update: with an admin user, iTunes works - even with the media library on the SDHC card. Do they have a total screw loose at Apple?

Update 2: After finding a tip in a web forum, I uninstalled all the iTunes stuff and QuickTime, then started the Apple Software Updater under my normal account with "Run as..." using the admin user, and installed iTunes and QuickTime that way. Intuitive is something else, but at least iTunes now works with my non-privileged account. MobileMe still makes its fuss, though, despite being installed via this method. Ways to debug it? None. Oh, and of course the software ends up on C: again instead of D: ...

Oh yeah, and why the hell does every stupid installer put icons on the desktop without asking? Ok, some ask, but the ones from Apple - no. Bah.

Dumb User Interfaces (c) by Apple

What idiot at Apple had the brilliant idea for the icon arrangement on the iPhone's "desktop"? And above all, whose great idea was it that updating a program places the updated icon not where the program actually was, but on the first free spot of the first page?

Either you get a merry icon shuffle after every update, or after a series of updates you get to hunt for where your icons have landed this time. The icon-pushing is especially great, of course, when all your pages are full - and if they aren't all full, updates happily shove icons across the pages. And has anyone ever tried to drag an icon from the last page to the first when you have more than two pages? No fun, absolutely no fun.

The designer responsible for the iPhone's icon configuration deserves to be slapped. At least for a week. Non-stop.

Banks and the Web

If you want to redefine miserable, you should take a look at banks and their use of the web. Did any of the programmers who created this garbage pile receive even minimal training? My latest "fun": the practical, low-tech TAN list is being discontinued. Now there's only Smart-TAN via code cards and similarly silly devices. Okay, it should actually work - generating session passwords isn't exactly new. But of course it doesn't work either. Why would it? That would be silly. I mean, honestly, did anyone expect anything different? And of course, my TAN access is now blocked. Because it's so incredibly secure to roll out technology that doesn't work and then forcibly lock people out.

But that's not even what bothers me most. What does? The bank's contact form. A simple form with a text field in the browser. So far, so good. You enter text, which goes directly to the customer advisor. Also good. I mean, that's all I want - write text, done. And what happens? I get this great message:

The text is too long. The text may only contain 11 lines with 36 characters each.

Excuse me? Hello? Have you ever seen a text field in a web browser? Is there a column ruler anywhere? Have you ever heard of flowing text? Am I seriously supposed to break my lines by hand to a maximum of 36 characters (which I have to count myself, since the input field gives me no help)? Have you all lost your minds in the data center?
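
If I really have to, I'd rather let a script do the counting - a minimal sketch (only the 11x36 limits are taken from the bank's error message; everything else here is made up):

    # Wrap free-flowing text into the bank's 11 lines of 36 characters.
    import textwrap

    def format_for_bank(text, max_lines=11, width=36):
        lines = textwrap.wrap(text, width=width)
        if len(lines) > max_lines:
            raise ValueError("too long: %d lines, only %d allowed" % (len(lines), max_lines))
        return "\n".join(lines)

    print(format_for_bank("Please unblock my TAN access - the Smart-TAN reader rejects ..."))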

Oh, and then, after I've formatted and counted (with editor support and cut-and-paste):

This text contains invalid characters. It may only contain digits, letters, umlauts, and an arbitrary and ill-considered selection of special characters.

Parentheses - which you might use when you include a note - no, they are evil and must not be used.

Sorry, but this is a total failure. And no, I don't want to hear anything about your downstream banking system only accepting 11 lines of 36 characters - as a customer, I don't care. Giving that as a reason only shows how stupid you are and how little you understand the subject. Sorry, colleagues, but this is pathetic.

Absurdities of the Media

In the doping circus, there's once again a real howler that I saw today in the Tour ticker:

Dear cycling friends! For good reason (the case of Patrik Sinkewitz), coverage of the 2007 Tour de France will initially be suspended until the allegations against Sinkewitz are clarified. We thank you for your interest! À bientôt, your tour.ARD.de team

Okay. So another rider under doping suspicion. Just like all along. Nothing really new - Team T-Mobile probably has a lot more to work through. Uh - and what is the point of stopping the coverage now? I mean, it was already pathetic enough - what exactly does stopping it now change?

The media could, of course, instead of the great outrage, reflect on how deeply they themselves are involved in the whole mess. Wasn't there something about direct cooperation between ARD reporters and Deutsche Telekom? Wasn't there the years-long media hype around Jan Ullrich and the free mega-advertising for Deutsche Telekom? Wasn't it the case that for the German media, hardly anything existed besides Telekom and their team?

And all of this played no part in the pressure that was built up? The Tour commentators this time had slogans ready about how one should also talk about the non-winners - and yet again, as every year, they fueled the winner speculation and stirred up the hype. But surely stopping the coverage will tremendously help the riders who don't dope (if there are any - I don't know whether that's the case). And ARD is setting a real example - of its own stupidity and arrogance, because the media, the sports associations, and the organizations all diligently built up this nonsense.

Cycling is left in the lurch - but the hypocritical directors and functionaries couldn't care less. Doping once brought in ratings, so it was ignored. Now they play the great critic - and cut just as ridiculous a figure as they did back then.