
Monarchs

Every year some Monarch butterflies overwinter in Santa Cruz. I finally went to have a look.

There weren't quite as many as I expected but that is probably due to the weather and the afternoon visit.

2014 Monarchs

Photos along with some of West Cliff Drive and the sky. The video (no audio) is looking up and around. It is a bit shaky due to being zoomed in, despite image stabilisation!

Category: misc


Long Read: Motorbiking around Angola

This is a story I reread about once a year and recommend to everyone who will listen. In 2007 some South Africans took their motorbikes around Angola. There are lots of pictures, stories about the Angolan people, touching moments, motorbiking, and it is just an overall wonderful read.

The first part is here. Part two is a few posts down. That part then has links to each successive part.

Category: misc – Tags: recommendation


Moving to Github

I have two active open source projects, and have been hosting them at Google Code. Now I'm moving them to Github. I'm still tidying up odds and ends.

Google Code (old home) → Github (new home)
https://code.google.com/p/apsw/ → https://github.com/rogerbinns/apsw
https://code.google.com/p/java-mini-python/ → https://github.com/rogerbinns/jia-mini-python

So let's start with some of the good things about Google Code:

  • Mercurial: Mercurial is my DVCS of choice, and it is well supported.

  • Decent Web Interface: Because code hosting sites are built by developers they tend to have idiosyncratic usability (architecture astronaut style), with Google Code being the least worst. For example on Github you can't even sort issues by priority. It is completely absurd.

  • Multiple repositories per project: The popular hosting services have an issue tracker, wiki, links etc per project. With the exception of Google Code, you only get one source repository per project. DVCS encourages a style of single purpose smaller repositories versus the large agglomeration that was more typical in the subversion and CVS days.

    It is far more pleasant being able to use a single issue tracker, wiki etc for a group of related source repositories. Only Google Code does this.

So why leave?

  • Downloads terminated: I have 14 files per release of APSW (most are Windows binaries, since Windows developers rarely have compilers and related tools installed) and one per release for the other project. Google no longer want to host downloads, yet that is the primary way others get my projects. Their suggested alternative is Google Drive, which they do not believe in supporting on Linux, and which is a terrible alternative anyway.
  • No future: There is no indication of any ongoing interest in Google Code, and they refuse to take money for it. Google keep shutting down services, and it is good practice not to be dependent on them. (No customer service, erroneous disabling of accounts etc.)

There are three choices for where to go:

  • Sourceforge: No chance
  • Bitbucket: I used them in the past, but they couldn't support personal and corporate use at the same time. They supported Mercurial, but being Atlassian means an enterprisey design (I hate using Jira). Most importantly, when I asked if they would guarantee availability of a download service for any period of time, they said they would not. So they are out.
  • Github: There are many things I don't like about Github, but at the end of the day they are the least worst and the only practical alternative.

This is how I converted my projects to Github:

  • Use fast-export to convert the Mercurial history to Git. (In theory you can talk Mercurial to Github but it isn't worth it. Also THG lost the ability to do line by line commits.)

  • Use Github Pages to host the generated documentation. However do not follow all of those instructions - in particular the gh-pages branch needs to be created as an orphan, as it has nothing to do with the master branch and its files. (Better instructions).

    The pages didn't render correctly because the stylesheets and images returned 404. You have to do some magic to fix it.

  • Use Google Code Issues Migrator to copy all existing issues over to Github and keep the same ids.

  • Grep the source tree for all mentions of `code.google` and fix them.

  • Edit the projects at Google Code to say they have moved.

Before doing the real projects, I experimented on a test repository to make sure that the documentation and releases worked. Sadly, releases have to be done manually: there don't appear to be any usable command line clients (not even Github's own), and none of the libraries I looked at handled releases either.
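If you did want to script it, Github's plain HTTP releases API is an option. Here is a rough sketch in Python; the owner, repository, token and file name are made-up placeholders, not values from my projects:

    import json
    import urllib.request

    OWNER = "someone"                    # placeholder account
    REPO = "someproject"                 # placeholder repository
    TOKEN = "personal-access-token"      # a Github personal access token

    def post(url, data, content_type="application/json"):
        req = urllib.request.Request(url, data=data, method="POST")
        req.add_header("Authorization", "token " + TOKEN)
        req.add_header("Content-Type", content_type)
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read().decode("utf-8"))

    # create the release against an existing tag
    release = post("https://api.github.com/repos/%s/%s/releases" % (OWNER, REPO),
                   json.dumps({"tag_name": "v1.0", "name": "1.0"}).encode("utf-8"))

    # upload each binary as a release asset (uploads go to a separate host)
    with open("project-1.0.win32.zip", "rb") as f:
        post("https://uploads.github.com/repos/%s/%s/releases/%d/assets?name=%s"
             % (OWNER, REPO, release["id"], "project-1.0.win32.zip"),
             f.read(), content_type="application/zip")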

Category: misc – Tags: github, apsw, google


The Apple Hegemony

There is lots of debate over Apple's success, mainly about quantity (Android's sweet spot) versus quality/revenue (Apple's sweet spot).

I used Fing to see what devices were on the network on a plane and at a hotel. You can freely download it.

Plane wifi

I was on a plane

Hotel wifi

And at a hotel

Category: misc – Tags: apple


A Santa Cruz day (2013)

The Santa Cruz wharf is turning 100 and there were some celebrations. There were cool old cars, various exhibits including old photos, and people speaking about things that didn't interest me. You can see some old photos, as well as details on the six different wharves here. While cycling on West Cliff Drive later, I came across a spot inundated with birds.

2013 Santa Cruz Day

Birds ...

Category: misc


Maligned by Github

Github screenshot

See the problem?

That is what Github tells the world about me. It isn't true and they won't let me change it. The problem area I am referring to is Java. The implication is that I have only one language talent, and that I endorse Java. While Java was a good solution in its first half-decade of life, it has been an increasingly poor one since then, in my opinion. I did ask Github support to remove it, and they refused.

They "calculate" it from your public Github activity. Not even that - they only calculate it from your public activity on public projects that don't belong to an organisation. For work I did a whole bunch of C# and Python for a public Unity plugin but even that doesn't dilute the Java.

Github refuse to consider putting your private activity into that calculation, which we'd have no problem with at all. Heck, we are hiring and would want people to see that Python is something we do a lot of, as well as Javascript, CSS, HTML, Objective-C, C, and even on occasion Java when doing Android work. The Github profile should help viewers get a better understanding of the person, not malign them.

Github screenshot

What I really do with Github

Github screenshot

What they tell the world

Category: misc


Triple Vision

At work I've been doing a lot more frontend work. This involves writing HTML templates, CSS and Javascript. I also do the backend work (web services, databases, processing) and my experience is that the more you can see at once, the more productive you can be [1]. This involves multiple terminals, browsers, admin tools, DOM inspectors, consoles and editors.

For many years I've used the biggest monitors I could afford or find [2]. There are various reports of multi-monitor productivity, but it is hard to distinguish between total display real estate and the number of monitors. For me, dual monitors have been how I increase my display real estate. (Obligatory Onion, web coding experience, engineering culture)

My motherboard has 4 video outputs (3 digital, one analogue), so adding a third monitor should be a no-brainer. Heck, there is even a video showing that it works just fine. I purchased a third monitor from Costco and plugged it into the VGA port as a stopgap (I needed to order a digital cable). It didn't work: I ended up with 0, 1 or 2 working displays, but never 3. Even more amusing was how confused everything got - the BIOS boot seems to pick the oldest connector technology to display on, while Linux would keep remembering which display was primary in some convoluted way that meant it was almost always wrong.

Eventually I tracked down the problem. Some programmable hardware known as a CRTC is needed to read the display memory at a rate suitable for the outputs it is connected to. Despite there being 4 outputs, there are only 2 CRTCs. A CRTC can be hooked up to multiple outputs provided the monitor/output clocking is identical, which usually requires using identical monitors (the same resolution is not sufficient).

With a heavy heart I decided to retire my smaller 16:10 monitor and get another monitor from Costco to replace it [3]. I now had two identical monitors that could share a CRTC and my original largest 16:10 to use the other one. Because of connectors and cables, my largest needed to use the DisplayPort connection with an adapter to HDMI.

You know what is coming next - the display wouldn't get a signal. It turns out that not only did they cheap out on the CRTCs, they also cheaped out on the DisplayPort connection. Despite there being a spare CRTC, and despite DisplayPort supporting multiple final outputs, they didn't bother, so I had to buy an active adapter.

Finally, after all that, my 3 screens work. I have a 28 inch 16:10 display in the centre (1920x1200), and 27 inch 16:9 screens on either side (1920x1080). Except under Windows, which detects all 3 screens just fine but refuses to save the settings when you enable the third. I don't use Windows for real work, so it doesn't matter.

What should have taken 30 minutes turned into a several week saga. Apparently computers are supposed to increase productivity.

[1]When I first started programming as a kid on Apple IIs and Sinclair Spectrums, the screens were small and you'd be lucky to get something like 32 x 20 characters on the screen. There was the slow trend of monitors getting slightly larger - 11 inches being replaced by 12, 12 by 13 etc. I don't remember those as unproductive times. By 2000 I was spending thousands on 19 and 21 inch monitors when I could.
[2]When working professionally in the 90s my colleagues got various workstations. I picked the NCD 19c, which was a whopping 19 inch display, 1280 x 1024 resolution and an X terminal (dumb display device). The display was great and I had apps running all over the place, displaying to the terminal. I didn't switch to a different primary device until 1999 or so.
[3]

There was also a detour via Nvidia. The only PC game I play is 10 years old and my work is mostly text editors, browsers and terminals. This doesn't require high end acceleration. I got an Nvidia card (low end 6100). It had two outputs and in theory would work with the Intel HD4000 for the third monitor.

In practice things didn't work well, especially on Linux where Nvidia's driver doesn't support standard display drivers, and is a closed binary so only Nvidia can add that functionality. It turns out there is a beta driver that did support it, but then refused to cooperate with the Intel driver. On Windows it worked fine.

I got a second Nvidia card since they claimed it would work via a "mosaic" mode. After eventually getting the right voodoo in the configuration (and it working fine on Windows), their driver refused to do it on the grounds that the chipset wasn't authorised, implying that if I had spent $400 per card then they'd be okay doing it. The cards were also noisy. I gave up on Nvidia, getting back a quieter machine with no arbitrary drivers that error out over authorisation.

Category: misc


Weather

On my Android devices I use the AIX Weather Widget, which shows a 24 hour forecast in a nice clean visual style.

aix screenshot

Northern California looks like this for most of the year. It also shows levels of rainfall, below-freezing temperatures etc.

By my bedside I've had a "weather station" with an outdoor wireless sensor for about a decade. It shows indoor and outdoor temperatures, has an "atomic" clock, and even projects the time and other fields on the ceiling. Rather nifty and very robust. I've changed the batteries on the outdoor sensor twice, and on the main unit once (it is plugged into an AC adaptor so the batteries are just backup). It also has weather forecasting that is a work of fiction.

Sadly the outdoor sensor died yesterday, and I now have to make a decision. The outdoor temperature was very useful, and getting it from the Internet isn't as accurate or as up to date. I decided to get a new unit, and there are a huge number of choices, but none match what I already have and was happy with. (The outdoor sensor is sold as a replacement, but costs more than a new weather station!)

I can reveal what all online reviews say for any model from any manufacturer:

  • 10% say it is wonderful, works for years, was easy to set up etc
  • 90% say it stopped receiving the outdoor sensor information within days or weeks. Also the customer service sucks. Setup was difficult. Some even measured with other devices and found the margin of error to be far too large (several degrees, despite showing a decimal point implying tenth-of-a-degree accuracy).

Unfortunately Amazon don't give an indication of sales volume. For example, 100 complaints out of 10 million purchases seems okay, but 100 complaints out of 1,000 purchases means stay away from that brand!

Looking across different models and manufacturers there are a variety of issues:

  • No indication of expected battery life
  • Can measure wind speed (good), but the sensor appears too small to be accurate
  • No AC power for the receiver (which means lots more battery changes)
  • No manuals online
  • No idea of release date (one has Copyright 2013 at the bottom of each page)
  • Record minimum and maximum temperatures (good), but wipe them at midnight so you can't see what they were for the previous day.
  • Require pushing buttons to see things (e.g. you can't see indoor and outdoor temps at the same time)
  • Cheaper units have a time display, but it is not automatically set
  • Only showing some little-used, bizarre measurement system instead of the correct thing

No one seems to use solar power for the outdoor sensor.

La Crosse have a maze of identical looking products with different model numbers and prices.

I feel some allegiance to them due to the longevity of my current unit, but then most complaints are about them too. I did consider going analog/old school hoping not to do gloves but that too has several problems.

So far my experience has involved trying to buy local. The first retailer charged 20% more than Amazon, and their Internet price matching meant waiting in a long line to get to the checkout, only to be told I had to go to another person at the other end of the store to get some piece of paper, and then return to the checkout queues. I wasn't jumping through those hoops, and left.

Another location had a different brand and unit that seemed good, but turned out terrible in practice. The screen had very limited viewing angles and a reflective face plate. (It also couldn't do a 24 hour clock.)

So Amazon it is for a La Crosse unit.

Category: misc – Tags: weather, android


Copyright

It amuses me just how many sites insist on putting Copyright YEAR on their pages. Some, like Google, even put it on their emails. For many decades it has been the case that expression is automatically copyrighted. (Before that you had to register. You can still register now, but putting a notice on your page does not count as doing so.) So the Copyright bit serves absolutely no useful purpose other than to litter your page.

The year is even worse. Copyrights last a long time, certainly long after the site has ceased being remotely relevant. In any event the copyright applies from the original date of the expression, not whatever the page footer says.

I generally see two kinds of years displayed. One is the current year, often on content that was obviously not created this year (e.g. there will be a date stamp from earlier). They are just automatically inserting the current year, which is wrong and misleading.

And then there are the folk who have some random year in the past. What I see on reading that is:

This content is stale, outdated and unmaintained. We don't care about it and neither should you. This has been happening since ...

—Copyright 2011

Google Email screenshot

A recent email from Google. One the other day said 2011.

Apple email screenshot

Apple of course go over the top. I doubt they are trademarking each email sent.

Appengine screenshot

The appengine main selector page. I seriously doubt it is unchanged in over 5 years. According to the web server, some of it was last changed in 2012, and the rest is so fresh and changing that under no circumstances should the browser consider using a cached copy.

You can see how they made the text less noticeable by using grey, but that is still just visual litter. Legally and visually, having nothing is the most effective. Please help stamp out this scourge.

Category: misc


SSL still sucks

In a fit of madness I decided to replace the self-signed SSL certificate I use with a proper one. The self-signed one I made was good for 10 years, so I didn't have to deal with renewal nonsense. Various clients did whine a bit, but usually there was some setting to tell them it was ok. The biggest problem with a self-signed cert is that you generally can't tell the difference between it and a middleman intercepting your connection.

SSL certificates are a curious business. Just like with the credit rating agencies and the banking crisis, the incentives and payments are all wrong. The certificate is analogous to an identity document like a drivers license or a passport. Identity documents are trusted not because of the information on them, but rather who issued them. My British passport is only trusted around the world because the British government is trusted as the issuer. A self signed SSL certificate is analogous to printing your own passport saying it comes from the Republic of You.

To get an SSL certificate you engage with a certificate authority. They will verify your identity information to some degree, accept your money and issue the certificate. But the people who care about your identity are the ones connecting to your site, and they haven't been involved in this process. The browser and operating system manufacturers handily include a list of trusted certificate authorities, but the way it works is that any name will be accepted provided any one of those authorities issued the certificate!
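To illustrate how mechanical that decision is, here is roughly what every TLS client does, shown with Python's ssl module and a made-up host name (my illustration, not part of the original setup). The connection succeeds if any authority in the bundled list signed the chain; which one did is not something anyone looks at:

    import socket
    import ssl

    context = ssl.create_default_context()   # loads the platform's trusted CA list

    with socket.create_connection(("www.example.com", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="www.example.com") as tls:
            # only reached if *some* authority in that list signed the chain;
            # which one, and how carefully it checked anything, is not considered
            print(tls.getpeercert()["issuer"])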

Here is the start of the long list of trusted authorities:

CA list screenshot

If any of those issued a certificate for my site, your browser would trust it, and the same goes for amazon.com or microsoft.com [1]. For the certificate authority businesses to operate, they need to remain in those trusted lists, but they also need to make it easy to exchange money for certificates.

Their solution is multiple "classes": they create intermediate certificate authorities [2], and each of those has different requirements. For example, a class 1 might check there is a working email address associated with the site, while a class 3 may involve people doing strong verification, including getting business documents and making phone calls. They charge more for the latter.

But it is rather pointless. Your browser accepts class 1 [3] certificates that cost $20 with minimal verification just as readily as class 3 certificates that cost the site thousands. As an end user, you could check the certificate for each site you visit, determine if you trust the certificate authority, go to their web site to read up on their statements for that class, and decide if it is an acceptable risk to you. You'll also notice they have some legalese disclaiming any liability, making it all essentially worthless.

This is all a long-winded way of saying that it doesn't really matter who issues your certificates: there are no real assurances behind them anyway, and nobody checks. They wouldn't be too different from real world passports printed on tissue paper that only machines ever look at. Consequently I used the free StartSSL certificate authority.

The sign-up process is annoying and tedious, largely to try to give the appearance of value and security. New certificate in hand, I then had to replace my self-signed certificate. This was more complicated because I also had to include the intermediary authority information.
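As a rough sketch of what that means (the file names here are invented, and every server has its own way of being configured): whatever terminates the TLS connection has to hand out the intermediary certificate along with the site certificate. With Python's ssl module, for example, that is one PEM file with the site certificate first:

    import ssl

    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    # site-and-intermediate.pem: the site certificate, followed by the
    # intermediary authority certificate that signed it
    context.load_cert_chain(certfile="site-and-intermediate.pem",
                            keyfile="site.key")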

The various applications I tried all worked perfectly, except those from Mozilla. In the olden days they would give detailed information about SSL issues, but it was gobbledygook, so naturally most users clicked whatever they could to get to the site as quickly as possible. It is virtually impossible to tell the difference between misconfiguration and a security breach. Their solution was to hide all that and show the minimum amount. Which leads us to today.

Thunderbird doesn't like the certificate for reasons I can't determine. It doesn't even show an error, so it looks like it is working while in reality it is repeatedly hanging up on the server. When using a self-signed certificate it at least puts up an error dialog where you can permanently accept the certificate. In this case the behaviour for a legitimate certificate is far worse! Going through several arcane menu sequences you can finally make it work, but this is ridiculous.

Firefox initially worked and then stopped. In the end I worked out that it had OCSP issues [4]. I can't really tell who is to blame, and ended up having to turn OCSP off.

Now I have a real certificate, it works provided you go nowhere near Mozilla products, and it isn't worth anything anyway. Madness indeed.

[1]There are some minimally deployed attempts to fix this like certificate pinning.
[2]SSL provides a chain of certifying authorities. It would be analogous to getting a passport from England that is then stamped by the British government. You trust the England intermediary because you trust the British government.
[3]The classes are made up by each certificate authority.
[4]OCSP allows checking whether a certificate has been revoked. If you lose the private part, someone could impersonate your site or read the traffic, so you would ask the issuer to add your certificate to a list of certificates not to be trusted.

Category: misc – Tags: ssl, thunderbird, mozilla
