
The Apple Hegemony

There is a lot of debate over Apple's success, mainly framed as quantity (Android's sweet spot) versus quality and revenue (Apple's sweet spot).

I used Fing, a free download, to see what devices were on the wifi network on a plane and at a hotel.

Plane wifi

I was on a plane

Hotel wifi

And at a hotel

Category: misc – Tags: apple


A Santa Cruz day (2013)

The Santa Cruz wharf is turning 100 and there were some celebrations. There were some cool old cars, exhibits including old photos, and various people speaking about things that didn't interest me. You can see some old photos, as well as details on the six different wharves here. While cycling on West Cliff Drive later, I came across a spot inundated with birds.

2013 Santa Cruz Day

Birds ...

Category: misc


Maligned by Github

Github screenshot

See the problem?

That is what Github tells the world about me. It isn't true and they won't let me change it. The problem area I am referring to is Java. The implication is that I have only one language talent, and implicitly that I endorse Java. While Java was a good solution in its first half-decade of life, it has been an increasingly poor solution since then, in my opinion. I did ask Github support to remove that, and they refused.

They "calculate" it from your public Github activity. Not even that - they only calculate it from your public activity on public projects that don't belong to an organisation. For work I did a whole bunch of C# and Python for a public Unity plugin but even that doesn't dilute the Java.

Github refuse to consider putting your private activity into that calculation. We'd have no problem with that at all. Heck, we are hiring and would want people to see that Python is something we do a lot of, as well as Javascript, CSS, HTML, Objective-C, C and, on occasion, even Java when doing Android work. The Github profile should help viewers get a better understanding of the person, not malign them.
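As an illustration of why the result is so skewed, here is a rough sketch of a "public personal repositories only" language breakdown computed from GitHub's public API. This is not necessarily how GitHub's profile graph is calculated, and the username is a placeholder, but it makes the limitation obvious: anything private, or owned by an organisation, simply never enters the count.

    # Sketch: language breakdown from public, personal repositories only.
    # Not GitHub's actual algorithm; the username is a placeholder.
    # Ignores pagination and API rate limits for brevity.
    import json
    import urllib.request
    from collections import Counter

    USER = "example-user"

    def fetch_json(url):
        with urllib.request.urlopen(url) as response:
            return json.load(response)

    def public_language_breakdown(user):
        totals = Counter()
        for repo in fetch_json(f"https://api.github.com/users/{user}/repos"):
            if repo.get("fork"):
                continue  # forks aren't really "your" code
            totals.update(fetch_json(repo["languages_url"]))  # bytes per language
        grand_total = sum(totals.values()) or 1
        return {lang: 100.0 * n / grand_total for lang, n in totals.items()}

    if __name__ == "__main__":
        breakdown = public_language_breakdown(USER)
        for lang, pct in sorted(breakdown.items(), key=lambda kv: -kv[1]):
            print(f"{lang:15s} {pct:5.1f}%")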

Github screenshot

What I really do with Github

Github screenshot

What they tell the world

Category: misc


Triple Vision

At work I've been doing a lot more frontend work. This involves writing HTML templates, CSS and Javascript. I also do the backend work (web services, databases, processing) and my experience is that the more you can see at once, the more productive you can be [1]. This involves multiple terminals, browsers, admin tools, DOM inspectors, consoles and editors.

For many years I've used the biggest monitors I could afford or find [2]. There are various reports of multi-monitor productivity, but it is hard to distinguish between total display real estate and the number of monitors. For me dual monitors have been how I increase my display real estate. (Obligatory Onion, web coding experience, engineering culture)

My motherboard has 4 video outputs (3 digital, one analogue) so adding a third monitor should be a no-brainer. Heck, there is even a video showing that it works just fine. I purchased a third monitor from Costco and plugged it into the VGA port (I needed to order a digital cable, so it was a stopgap). It didn't work: I ended up with 0, 1 or 2 working displays, but never 3. Even more amusing was how confused everything got - the BIOS boot seems to pick the oldest connector technology to display on, while Linux would keep remembering which display was primary in some convoluted way that meant it was almost always wrong.

Eventually I tracked down the problem. Some programmable hardware known as a CRTC is needed to read the display memory at a rate suitable for the outputs it is connected to. Despite having 4 outputs, there are only 2 CRTCs. A CRTC can be hooked up to multiple outputs providing the monitor/output clocking is identical, which usually requires using identical monitors (the same resolution is not sufficient).
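If you are curious which CRTC each output has ended up on, xrandr will tell you. Here is a small sketch (assuming an X11 session with the xrandr utility installed) that parses the verbose output and prints the mapping:

    # Print which CRTC each connected output is using, via "xrandr --verbose".
    # Assumes an X11 session with xrandr installed.
    import re
    import subprocess

    def output_to_crtc():
        text = subprocess.run(["xrandr", "--verbose"], capture_output=True,
                              text=True, check=True).stdout
        mapping, current = {}, None
        for line in text.splitlines():
            # Output lines start at column 0, e.g. "HDMI1 connected 1920x1080+0+0 ..."
            m = re.match(r"^(\S+) (connected|disconnected)", line)
            if m:
                current = m.group(1) if m.group(2) == "connected" else None
            elif current and line.strip().startswith("CRTC:"):
                mapping[current] = line.split(":", 1)[1].strip()
        return mapping

    if __name__ == "__main__":
        for output, crtc in output_to_crtc().items():
            print(f"{output:10s} -> CRTC {crtc}")

Two outputs reporting the same CRTC number is exactly the shared-clocking case described above.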

With a heavy heart I decided to retire my smaller 16:10 monitor and get another monitor from Costco to replace it [3]. I now had two identical monitors that could share a CRTC and my original largest 16:10 to use the other one. Because of connectors and cables, my largest needed to use the DisplayPort connection with an adapter to HDMI.

You know what is coming next - the display wouldn't get a signal. It turns out that not only did they cheap out on the CRTCs, they also cheaped out on the DisplayPort connection: despite there being a spare CRTC, and despite DisplayPort being able to drive other connector types, a passive adapter to HDMI wouldn't work. I had to buy an active adapter.

Finally after all that my 3 screens work. I have a 28 inch 16:10 display in the centre (1920x1200), and 27 inch 16:9 screens on either side (1920x1080). The exception is Windows, which detects all 3 screens just fine but refuses to save the settings when you enable the third. I don't use Windows for real work so it doesn't matter.

What should have taken 30 minutes turned into a several week saga. Apparently computers are supposed to increase productivity.

[1]When I first started programming as a kid on Apple IIs and Sinclair Spectrums, the screens were small and you'd be lucky to get something like 32 x 20 characters on the screen. There was the slow trend of monitors getting slightly larger - 11 inches being replaced by 12, 12 by 13 etc. I don't remember those as unproductive times. By 2000 I was spending thousands on 19 and 21 inch monitors when I could.
[2]When working professionally in the 90s my colleagues got various workstations. I picked the NCD 19c, which was a whopping 19 inch display, 1280 x 1024 resolution, and an X terminal (dumb display device). The display was great and I ran apps all over the place, all displaying to the terminal. I didn't switch to a different primary device until 1999 or so.
[3]

There was also a detour via Nvidia. The only PC game I play is 10 years old and my work is mostly text editors, browsers and terminals. This doesn't require high end acceleration. I got an Nvidia card (low end 6100). It had two outputs and in theory would work with the Intel HD4000 for the third monitor.

In practice things didn't work well, especially on Linux, where Nvidia's driver doesn't support the standard display infrastructure and is a closed binary, so only Nvidia can add that functionality. It turns out there is a beta driver that did support it, but it then refused to cooperate with the Intel driver. On Windows it worked fine.

I got a second Nvidia card since they claimed it would work via a "mosaic" mode. After eventually getting the right voodoo in the configuration (and it working fine on Windows), their driver refused to do it on the grounds that the chipset wasn't authorised, implying that if I had spent $400 per card then they'd be okay doing it. The cards were also noisy. I gave up on Nvidia, got my quieter machine back, and no longer have arbitrary drivers that error out on authorisation grounds.

Category: misc


Weather

On my Android devices I use the AIX Weather Widget, which shows a 24 hour forecast in a nice, clean visual style.

aix screenshot

Northern California looks like this for most of the year. It also shows levels of rainfall, below-freezing temperatures, etc.

By my bedside I've had a "weather station" for about a decade that has an outdoor wireless sensor. It shows indoor and outdoor temperatures, has an "atomic" clock, and even projects the time and other fields on the ceiling. Rather nifty and very robust. I've changed the batteries on the outdoor sensor twice, and on the main unit once (it is plugged into an AC adaptor, so the batteries are just a backup). It also has weather forecasting that is a work of fiction.

Sadly yesterday the outdoor sensor died, and I now have to make a decision. The outdoor temperature was very useful. Getting the temperature from the Internet isn't as accurate or as up to date. I decided to get a new unit, and there are a huge number of choices, but none matching what I already have and was happy with. (The outdoor sensor is sold as a replacement but costs more than a new weather station!)

I can reveal what all online reviews say for any model from any manufacturer:

  • 10% say it is wonderful, works for years, was easy to set up, etc
  • 90% say it stopped receiving the outdoor sensor information within days or weeks, that customer service sucks, and that setup was difficult. Some even measured with other devices and found the margin of error far too large (several degrees, despite showing a decimal point that implies tenth-of-a-degree accuracy).

Unfortunately Amazon don't give an indication of sales volume. 100 complaints against 10 million purchases is a 0.001% failure rate and seems okay, but the same 100 complaints against 1,000 purchases is 10%, which means stay away from that brand!

Looking across different models and manufacturers there are a variety of issues:

  • No indication of expected battery life
  • Can measure wind speed (good), but the sensor appears too small to be accurate
  • No AC power for the receiver (which means lots more battery changes)
  • No manuals online
  • No idea of release date (one has Copyright 2013 at the bottom of each page)
  • Record minimum and maximum temperatures (good), but wipe them at midnight so you can't see what they were for the previous day
  • Require pushing buttons to see things (eg you can't see indoor and outdoor temps at the same time)
  • Cheaper units have a time display but don't set it automatically
  • Only show some little-used bizarre measurement system instead of the correct one

No one seems to use solar power for the outdoor sensor.

La Crosse have a maze of identical-looking products with different model numbers and prices.

I feel some allegiance to them due to the longevity of my current unit, but then most complaints are about them too. I did consider going analog/old school hoping not to do gloves but that too has several problems.

My attempts so far have involved trying to buy locally. The first retailer charged 20% more than Amazon, and their Internet price matching meant waiting in a long line to get to the checkout, only to be told I had to go to another person at the other end of the store to get some piece of paper, and then return to the checkout queue. I wasn't jumping through those hoops, and left.

Another location had a different brand and unit that seemed good, but turned out terrible in practice. The screen has very limited viewing angles and a reflective face plate. (It also couldn't do a 24 hour clock.)

So Amazon it is for a La Crosse unit.

Category: misc – Tags: weather, android


Copyright

It amuses me just how many sites insist on putting Copyright YEAR on their pages. Some, like Google, even put it on their emails. For many decades it has been the case that expression is automatically copyrighted. (Before that you had to register. You can still register now, but putting that on your page does not count.) So the Copyright bit serves absolutely no useful purpose other than to litter your page.

The year is even worse. Copyrights last a long time, certainly long after the site has ceased being remotely relevant. In any event they apply to the original date of the expression, not to whatever the page footer says.

I generally see two years displayed. One is the current year, and it is often on content that was obviously not created this year (eg there will be a date stamp from earlier). They are just automatically using the current year, which is wrong and misleading.

And then there are the folk who have some random year in the past. What I see on reading that is:

This content is stale, outdated and unmaintained. We don't care about it and neither should you. This has been happening since ...

—Copyright 2011

Google Email screenshot

A recent email from Google. One from the other day said 2011.

Apple email screenshot

Apple of course go over the top. I doubt they are trademarking each email sent.

Appengine screenshot

The appengine main selector page. I seriously doubt it is unchanged in over 5 years. According to the web server some of it was last changed in 2012, and the rest is so fresh and changing that under no circumstances should the browser consider using a cached copy.
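You can check what any server claims for yourself. This small sketch (the URL is a placeholder) prints the freshness and caching headers for a page:

    # Print what a server claims about a page's age and cacheability.
    # The URL is a placeholder; substitute the page you are curious about.
    import urllib.request

    URL = "https://example.com/"

    request = urllib.request.Request(URL, method="HEAD")
    with urllib.request.urlopen(request) as response:
        for header in ("Last-Modified", "Cache-Control", "Expires", "ETag"):
            print(f"{header}: {response.headers.get(header, '(not sent)')}")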

You can see how they made the text less noticeable using grey. But that is still just visual litter. Legally and visually, having nothing is the most effective. Please help stamp out this scourge.

Category: misc


SSL still sucks

In a fit of madness I decided to replace the self-signed SSL certificate I use with a proper one. The self-signed one I made was good for 10 years so I didn't have to deal with renewal nonsense. Various clients did whine a bit, but usually there was some setting to tell them it was ok. The biggest problem with a self-signed cert is you generally can't tell the difference between it and a middleman intercepting your connection.
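For reference, a long-lived self-signed certificate is a one-liner with openssl; this sketch just wraps that command (the hostname is a placeholder and the openssl binary must be installed):

    # Generate a 10-year self-signed certificate and unencrypted key.
    # The common name is a placeholder.
    import subprocess

    HOST = "example.com"

    subprocess.run([
        "openssl", "req", "-x509",        # self-signed certificate
        "-newkey", "rsa:2048", "-nodes",  # new RSA key, no passphrase
        "-keyout", "key.pem",
        "-out", "cert.pem",
        "-days", "3650",                  # ten years, to avoid renewal nonsense
        "-subj", f"/CN={HOST}",
    ], check=True)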

SSL certificates are a curious business. Just like with the credit rating agencies and the banking crisis, the incentives and payments are all wrong. The certificate is analogous to an identity document like a driver's license or a passport. Identity documents are trusted not because of the information on them, but rather because of who issued them. My British passport is only trusted around the world because the British government is trusted as the issuer. A self-signed SSL certificate is analogous to printing your own passport saying it comes from the Republic of You.

To get an SSL certificate you engage with a certificate authority. They will verify your identity information to some degree, accept your money and issue the certificate. But the people who care about your identity are the ones connecting to your site, and they haven't been involved in this process. The browser and operating system manufacturers handily include a list of trusted certificate authorities, but the way it works is that any name will be accepted providing any of those authorities issued it!

Here is the start of the long list of trusted authorities:

CA list screenshot

If any of those issued a certificate for my site - or for amazon.com or microsoft.com [1] for that matter - your browser would trust it. For the certificate authority businesses to operate, they need to remain in those trusted lists, but also need to make it easy to exchange money for certificates.
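You can see for yourself who vouches for any site you visit. This sketch uses Python's ssl module (the hostname is a placeholder) to print the subject and issuer of a site's certificate; all your browser requires is that the issuer chains back to something in that list:

    # Show which certificate authority issued the certificate for a host.
    import socket
    import ssl

    HOST = "example.com"  # placeholder hostname

    context = ssl.create_default_context()
    with socket.create_connection((HOST, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            cert = tls.getpeercert()
            subject = dict(pair for rdn in cert["subject"] for pair in rdn)
            issuer = dict(pair for rdn in cert["issuer"] for pair in rdn)
            print("Subject:", subject.get("commonName"))
            print("Issuer: ", issuer.get("organizationName"), "/",
                  issuer.get("commonName"))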

Their solution is multiple "classes" - they create intermediate certificate authorities [2] and each of those then has different requirements. For example a class 1 might check there is a working email address associated with the site, while class 3 may involve people doing strong verification including getting business documents and making phone calls. They charge more for the latter.

But it is rather pointless. Your browser accepts class 1 [3] certificates that cost $20 with minimal verification just as readily as class 3 certificates that cost the site thousands. As an end user, you could check the certificate for each site you visit, determine if you trust the certificate authority, go to their web site to read up what their statements are for that class and decide if it is an acceptable risk to you. You'll also notice they'll have some legalese disclaiming any liability, making it essentially worthless.

This is all a long-winded way of saying that it doesn't really matter who issues your certificates; there are no real assurances behind them anyway, and nobody checks. They wouldn't be too different from real-world passports printed on tissue paper that only machines ever look at. Consequently I used the free StartSSL certificate authority.

The sign-up process is annoying and tedious, largely to try to give the appearance of value and security. New certificate in hand, I then had to replace my self-signed certificate. This was more complicated because I also had to include the intermediary authority information.

The various applications I tried all worked perfectly, except those from Mozilla. In the olden days they would give detailed information about SSL issues, but it was gobbledygook, so naturally most users clicked whatever they could to get to the site as quickly as possible. It is virtually impossible to tell the difference between misconfiguration and security breaches. The solution was to hide all that and try to show the minimum amount. Which leads us to today.

Thunderbird doesn't like the certificate for reasons I can't determine. It doesn't even show an error, so it looks like it is working but in reality it is repeatedly hanging up on the server. When using a self-signed certificate it at least puts up an error dialog where you can say to permanently accept the certificate. In this case the behaviour for a legitimate certificate is far worse! Going through several arcane menu sequences you can finally make it work, but this is ridiculous.
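At least you can take Thunderbird out of the equation and drive the handshake yourself. This sketch (hostname and port are placeholders) attempts a verified TLS connection using the system trust store and reports whether the certificate chain, including the intermediary, actually checks out:

    # Attempt a verified TLS handshake against a server, independent of any
    # mail or web client.  Hostname and port are placeholders.
    import socket
    import ssl

    HOST, PORT = "mail.example.com", 993  # e.g. IMAPS

    context = ssl.create_default_context()  # system trust store, hostname checks on
    try:
        with socket.create_connection((HOST, PORT), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=HOST) as tls:
                print("Handshake OK using", tls.version())
    except ssl.SSLCertVerificationError as e:
        # typical causes: missing intermediate, expired cert, hostname mismatch
        print("Certificate verification failed:", e)
    except (ssl.SSLError, OSError) as e:
        print("Connection or handshake failed:", e)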

Firefox initially worked and then stopped. In the end I worked out that it had OCSP issues [4]. I can't really tell who is to blame, and ended up having to turn OCSP off.

Now I have a real certificate: it works provided you go nowhere near Mozilla products, and it isn't worth anything anyway. Madness indeed.

[1]There are some minimally deployed attempts to fix this like certificate pinning.
[2]SSL provides a chain of certifying authorities. It would be analogous to getting a passport from England that is then stamped by the British government. You trust the England intermediary because you trust the British government.
[3]The classes are made up by each certificate authority.
[4]OCSP allows checking if a certificate has been revoked. If you lose the private key, someone could impersonate your site or read the traffic, so you would ask the issuer to add the certificate to a list of certificates not to be trusted.

Category: misc – Tags: ssl, thunderbird, mozilla


All Change

Things look different here. The hand-written HTML and styles were from a decade ago, and didn't really support my resume's claim of long-time proficiency in the web arts. (It was of course "cobbler's children have no shoes" syndrome.)

I also originally supported Google Plus, and posted occasional content there. However over the last 6 months or so Google has made maximum efforts to make reading it as difficult as possible. I missed having somewhere to post the periodic rants and finds.

Now you can see my fix. This site is built using Nikola, which ticks the boxes: it is simple, uses Python, and doesn't require yet more usernames and passwords, complicated servers and databases, worrying about various attacks, maintenance, etc. Under the hood there is jquery and bootstrap, but ironically I write this content even further removed from the web arts than before.

There is even an RSS feed, which pleases me a lot after the Reader shutdown and the walled garden of Google Plus. Anyone can read (and respond, thanks to disqus) in virtually any way that works for them.

Doing the conversion was relatively easy but somewhat tedious. Nikola let me keep existing files and do incremental conversions. My resume took a while as the structure wasn't that good. pandoc does a good first pass on other docs.

I also redid all my Google Plus posts here in this blog. Data Takeout let me get the underlying data, but it took many hours of Python coding to munge the data into a reasonably good looking and compatible format.
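The conversion is conceptually simple; it is the details that eat the hours. Here is a much-simplified sketch of the kind of munging involved, assuming each Takeout activity is a JSON file with title, published and content fields (the real export is messier, and those names are illustrative), producing a post with a Nikola-style metadata header:

    # Much-simplified sketch of converting a Takeout activity to a Nikola post.
    # The JSON field names are illustrative, not the real Takeout schema.
    import json
    import pathlib
    import re

    def slugify(title):
        return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-") or "post"

    def convert(activity_file, out_dir):
        data = json.loads(pathlib.Path(activity_file).read_text())
        title = data.get("title", "Untitled")
        date = data.get("published", "2013-01-01T00:00:00Z")
        body = data.get("content", "")       # HTML from the original post

        slug = slugify(title)
        header = "\n".join([
            f".. title: {title}",
            f".. slug: {slug}",
            f".. date: {date}",
            ".. tags: gplus",
        ])
        # keep the original HTML by wrapping it in a reST raw directive
        indented = "\n".join("   " + line for line in body.splitlines())
        (pathlib.Path(out_dir) / f"{slug}.rst").write_text(
            f"{header}\n\n.. raw:: html\n\n{indented}\n")

    if __name__ == "__main__":
        convert("activity.json", "posts")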

Category: misc – Tags: nikola


I pick future dead technologies

Shared article: linux kernel monkey log - "Here's some thoughts about some hardware I was going to use, hardware I use daily, and hardware I'll probably use someday in the future. ..."

It turns out that I am extremely competent at picking future dead technologies when buying Lenovo laptops. My T61 purchase a few years ago included wireless USB. The same wireless USB that failed.

My most recent acquisition last year included Thunderbolt. Well that is dead too.

I wonder what I'll be picking next time...

Category: gplus – Tags: humour


From Google Reader to tt-rss

Screenshot

I switched from Google Reader to tt-rss two weeks ago. It works well and is my permanent solution. That also means I don't have a page permanently open at Google any more, which means far more conscious effort is needed to hit Google Plus, which still remains unreadable. The screenshot shows, in yellow, just how few of the two million pixels on my screen are devoted to actual content. Compare to Reader or tt-rss, which come in at close to 100%. I'll probably give up on Plus completely.

On the topic of a Reader replacement, tt-rss pluses are that I can run it on my own server, there are two Android clients (I like the unofficial one), feeds are automatically sorted into alphabetical order, the mobile web view is good, there is a well engineered and extensive plugin mechanism, updates are easy, and you can import your Reader feeds and starred articles. There are lots of little pleasant touches here and there. It is fully open source.

On the minus side, the UI lags when showing that articles have been read. The developer says this is because the Javascript rate limits hits to the server, and the UI isn't updated until the server has been told. This means it can take up to 15 seconds from when an article is read until the UI shows that. They also grey the text instead of using a border like Reader did. The Reader approach is better UI, especially if you are part way through reading the article. The main developer can be quite abrasive, but anyone can fork the project if that turns out to be too much of a problem.

Category: gplus – Tags: google reader, tt-rss
