Twelve (was Nine) Years Later and No Closer To Shipping: Terrafugia Transition’s Continuously Slipping Delivery Date…

Terrafugia ship date slip chart

Terrafugia incorporated in May 2006, and now twelve years later, in July 2018, their Transition aircraft remains the same “two years” away from first customer delivery as when they started.  Considering Terrafugia’s history of delivery estimates, prospective customers should probably toss the most recent one onto the heap with all of the other missed estimates the company has published.

This list of their most prominent public announcements about expected delivery of the first Transition aircraft to a customer shows no progress toward shipment.  It was compiled in hopes of discerning a trend that might indicate whether the company was closing in on a ship date; they’re not converging.  The estimates show quite the opposite; Terrafugia’s statements have stuck to using a date that’s always about two years from today (average 21.0 months, stdev.p 3.7), regardless of when today actually falls.  October, 2008 represented the high point for customers when Terrafugia proclaimed that they were only 15 months away from starting shipments.  On average since then, Terrafugia has pushed their deadline further into the future and assiduously avoided converging on a real ship date.

Terrafugia ship date slippage

Terrafugia Transition’s Estimated Ship Date moves in lock step with the calendar and they appear to be settling in on a delivery “estimate” that’s consistently “24 months in the future”. If Terrafugia were converging on a ship date, then the orange curve should flatten out and the blue one should curve down toward zero as press releases indicated progress toward an actual shipment.

It appears that either Terrafugia employs really poor forecasters, or they’re trying to hide something from the public, or maybe it’s just far harder to bring an airplane to market than newly minted college graduates could have imagined.  Whatever the motivation, their announcements push right up against the line between optimism and deception.

Update (later May 11, 2015): Only Terrafugia’s press releases indicate they’re making no progress. Their accomplishments do show advancement, just not at the optimistic pace their press releases would have you believe. They’ve certainly learned a lot from the Proof-Of-Concept and first prototype vehicles as well as from their positive FAA and DOT regulatory decisions.  Hopefully their press officer learns from this post too.

Update (July, 2018):  XKCD has gotten on board by tracking JWST launch date estimates!

Since delays should get less likely closer to the launch, most astronomers in 2018 believed the expansion of the schedule was slowing, but by early 2020 new measurements indicated that it was actually accelerating.

FWIW, the slope for Terrafugia’s first customer ship estimates is 1.12. Ouch!
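That slope can be reproduced with an ordinary least-squares fit of “estimated ship date” against “announcement date.” The date pairs below are illustrative stand-ins, not Terrafugia’s actual announcements; the point is the method: a slope near 1 means the target recedes as fast as the calendar advances, while a slope of 0 would mean a fixed ship date.

```javascript
// Least-squares slope of estimated ship date vs. announcement date.
function slipSlope(pairs) {
  const x = pairs.map(p => p.announced);
  const y = pairs.map(p => p.estimated);
  const n = pairs.length;
  const mx = x.reduce((a, b) => a + b, 0) / n;
  const my = y.reduce((a, b) => a + b, 0) / n;
  let num = 0, den = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);  // covariance term
    den += (x[i] - mx) ** 2;           // variance term
  }
  return num / den;
}

// Months since January 2008: announcement vs. promised first delivery.
// (Made-up sample points that mimic the slippage pattern in the chart.)
const estimates = [
  { announced: 10, estimated: 25 },
  { announced: 29, estimated: 48 },
  { announced: 53, estimated: 76 },
  { announced: 89, estimated: 113 },
];
console.log(slipSlope(estimates).toFixed(2));  // → "1.11" for this sample
```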

Pebble vs. WeLoop Tommy Smart Watch Specification Comparison

WeLoop, in cooperation with DealeXtreme, released the Tommy smart watch (US$75, originally announced at US$70) on September 15, 2014. This table compares the technical specifications of the Tommy and Pebble smart watches.
Edit 2014-10-13: Updated prices
Edit 2014-12-11: WeLoop will release an SDK

Spec | Pebble | WeLoop Tommy
Price | US$100 (Original), US$200 (Steel) | US$75
Dimensions | 52 × 36 × 11.5 mm (Original); 46 × 34 × 10.5 mm (Steel) | 45 × 34.5 × 11 mm
Case colors | (Steel) Stainless steel in brushed silver or matte black; (Original) polycarbonate in fly blue, hot pink, fresh green, black, grey, white, orange, or red | Polycarbonate in black or red (white in the future)
Watch band(s) included | (Steel) Proprietary band connector; stainless steel band in brushed silver or matte black matching the watch, plus a black leather band. (Original) Standard 22 mm band connector; color-matched 22 mm band included: silicone (black) or TPU (white, blue, pink, green) | Color-matched TPU band, proprietary band connector
CPU | Cortex-M3, up to 80 MHz | Cortex-M0
RAM | 128 KB SoC | 24 KB (8 KB SoC + 16 KB external)
Flash memory | 4 MB (8 MB in Steel) external + 512 KB SoC | 512 KB external + 256 KB SoC
CPU power usage | 32 µW/MHz | 13.36 µW/MHz
Battery | 7 days, 130 mAh | 21 days, 110 mAh
App store | 1000’s of custom apps and watch faces (including pedometers, find my phone, RSS readers, swimming lap coach, home control, remote controls, and much, much more) | None (Update Dec 2014: WeLoop announced they will release an SDK, date unknown)
Common built-in apps | Caller ID, accept/reject calls, notifications, multiple analog & digital watch faces, music control (both watches)
Unique built-in apps | Multiple alarms, RunKeeper display | Notification filtering, pedometer, find my phone, camera remote
Sensors | 3-axis accelerometer, magnetometer, ambient light, thermal | 3-axis accelerometer
Crystal material | Scratch-resistant polycarbonate (Original); Gorilla Glass (Steel) | Mineral glass
Backlight | White | Blue
Compatibility | iOS & Android (both watches)
Bluetooth | 4.0+LE (both watches)
Display | Sharp 1.26″ Memory LCD, 144×168 monochrome pixels (both watches)
Output | Vibration (both watches)
Waterproof | 5 ATM (both watches)

Personalize the Meaning of “Retina Display”

Apple’s Retina Display marketing broadly publicized the concept of retinal acuity, but each person’s vision differs; so, just how small do those pixels need to be for your vision?

Fortunately, inverting the well-known Snellen notation (e.g. 20/20 corrected vision, 20/30 uncorrected vision, etc.) gives your personal visual acuity in minutes of arc. For example, 20/20 inverts to 20/20 = 1, meaning that 20/20 vision can resolve details 1 arc minute in size.  Similarly, someone with 20/60 vision has a visual acuity of 60/20 = 3 arc minutes, and 20/15 vision can resolve 15/20 = 0.75 arc minutes.  Go ahead and calculate your own visual acuity in arc minutes.  Ready?

OK, let’s see how tiny the pixels on a screen need to be to make it a retina display for you.  To do this, we’ll calculate the smallest pixels that you can resolve at a given distance. For example, if you have 20/20, or 1 arc minute, vision and hold a smartphone 11 inches (28 cm) away, you’ll be able to resolve individual pixels if there are 313 pixels per inch (123 pixels/cm) or fewer; if it has more pixels than that per inch/cm (i.e. higher pixel density & smaller pixels), then it’s a “retina display”.

Here’s how to calculate the minimum number of pixels per distance to match your eyes (fill in your visual resolution in place of “1”):

tan(½ × 1 arc minute) × 2 × 11 inches = 0.0032 inches (or the inverse of 313 pixels per inch (ppi) or more)
tan(½ × 1 arc minute) × 2 × 28 cm       = 0.00814 cm (or 123 pixels per cm (ppcm) or more)

The spreadsheet formulas for this look like:

<resolvable pixel> = tan(radians(0.5 * <your arc min.>/60)) * 2 * <distance>
<pixel density> = 1 / <resolvable pixel>
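Those spreadsheet formulas translate directly into code. A small sketch (JavaScript; any length unit works — pass inches to get ppi, centimeters to get ppcm):

```javascript
// Matching "retina" pixel density for a given visual acuity (arc minutes)
// and viewing distance: invert the smallest resolvable pixel pitch.
function retinaDensity(arcMinutes, distance) {
  const radians = (arcMinutes / 60) * Math.PI / 180;     // arc min -> radians
  const pixelPitch = Math.tan(radians / 2) * 2 * distance;
  return 1 / pixelPitch;                                  // pixels per unit length
}

console.log(Math.round(retinaDensity(1, 11)));  // 20/20 vision, 11 in → 313 ppi
console.log(Math.round(retinaDensity(1, 28)));  // 20/20 vision, 28 cm → 123 ppcm
```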

In more detail: to calculate the pixel size s opposite the viewer, divide the viewing angle a in half to form a right triangle with the viewing distance d adjacent to the angle and half a pixel (½s) opposite it. (Recall that 1 arc minute = 1/60 degree.) Then, with basic trigonometry:

tangent(angle) = opposite/adjacent
tan(½a) = (½s)/d
½s = tan(½a) × d
  s = tan(½a) × 2d

The tangent of half the angle times twice the distance gives the pixel spacing needed.

If you were looking at a television 5-½ feet away instead, then you’d only be able to resolve 52 ppi (20 ppcm):

tan(½ × 1 arc minute) × 2 ×   66 in.  = 0.0192 in. or 52 ppi
tan(½ × 1 arc minute) × 2 × 170 cm = 0.0495 cm or 20 ppcm

A 42-inch diagonal, full HD television (1920×1080) also happens to have 52 pixels per inch; therefore, when viewed from 5-½ feet or farther the pixels begin to blur together for 20/20 vision.  Homework: how close/far should you sit from your television to turn it into a “retina” display? Enjoy!
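If you want to check your homework answer, the same formula can be inverted to solve for the viewing distance instead of the pixel pitch; a sketch:

```javascript
// Distance at which a display's pixels shrink below your acuity threshold:
// invert s = tan(a/2) * 2 * d with s = 1/ppi and solve for d.
function retinaDistance(arcMinutes, pixelsPerInch) {
  const radians = (arcMinutes / 60) * Math.PI / 180;
  return 1 / (pixelsPerInch * 2 * Math.tan(radians / 2));  // inches
}

// 42-inch 1080p TV (~52 ppi) with 20/20 vision:
console.log(retinaDistance(1, 52) / 12);  // ≈ 5.5 feet
```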

Snellen acuity | Visual resolution (arc min) | iPhone (11 in) ppi | iPhone (28 cm) ppcm | TV (9′) ppi | TV (2.75 m) ppcm
20/200 | 10 | 31 | 12 | 3 | 1
20/100 | 5 | 63 | 25 | 6 | 3
20/70 | 3.5 | 89 | 35 | 9 | 4
20/50 | 2.5 | 125 | 49 | 13 | 5
20/30 | 1.5 | 208 | 82 | 21 | 8
20/20 | 1 | 313 | 123 | 32 | 13
20/15 | 0.75 | 417 | 164 | 42 | 17

Spaceballs, Comic-Con & nubrella’s new backpack style umbrella

Submitted for your consideration: Nubrella’s backpack-style hands-free umbrella (above), and Lord Dark Helmet seen visiting Comic-Con (right). Spaceballs fans everywhere: how cool would it be for them to add a blacked-out Lord Dark Helmet model to their lineup!

Obfuscated Javascript.Explained()

And this, of course, evaluates to the string “fail”.
(![]+[])[+[]]+(![]+[])[+!+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]];
— Marcus Lagergren (@lagergren) May 23, 2013

Rewrite to show the four (string)[subscript] uses:
 (![]+[])        [+[]]
+(![]+[])        [+!+[]]
+([![]] + [][[]])[+!+[]+[+[]]]
+(![]+[])        [!+[]+!+[]];

Then evaluate each building block (the original post color-coded these):
![]+[]       => "false"
+[]          => 0
+!+[]        => 1
!+[]+!+[]    => 2
+!+[]+[+[]]  => "10"

That third string really exercises those square brackets!
[![]] + [][[]] => "falseundefined"
                   01234567890123

Finally, we’re ready to ‘fail’:
"false"[0] + "false"[1] + "falseundefined"[10] + "false"[2]
   "f"     +    "a"     +         "i"          +    "l"
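You can verify the whole thing by pasting the tweet’s expression into any JavaScript console; this sketch evaluates it and its building blocks:

```javascript
// Evaluate the tweet's expression directly.
const result = (![]+[])[+[]]+(![]+[])[+!+[]]+([![]]+[][[]])[+!+[]+[+[]]]+(![]+[])[!+[]+!+[]];
console.log(result);  // "fail"

// The building blocks:
console.log(![] + []);        // "false"  (boolean false coerced to a string)
console.log([![]] + [][[]]);  // "falseundefined"  ([false] + undefined)
console.log(+!+[] + [+[]]);   // "10"  (number 1 + array [0] coerces to "1" + "0")
```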

XKCD Subways of North America – with !

XKCD comic #1196 – Subways has a great map of the subways in North America linked up by some fantastic, mythical branch lines. I started collecting the actual maps for all of the subways, until I recalled that The Urban Mass Transit Systems of North America map from Yale Professor Bill Rankin, on his web site Radical Cartography (ca. 2006), has already done the work. That map scales, rotates, and geographically aligns the subway systems to allow accurate comparisons. Mr. Munroe probably got his inspiration from that map!

In case you’re interested, here’s the collection:

Vancouver

Montreal

Boston

San Francisco (BART)

San Francisco (MUNI)

Toronto

Chicago

Cleveland

New York, New Jersey (PATH)

Atlanta

Mexico City

Google Glass’ Retina Display? Almost…

Google Glass Explorer edition

Google Glass’ Explorer hardware

The Explorer version of Google Glass pairs a Himax HX7309 nHD LCOS microdisplay with optics to project a “display [that] is the equivalent of a 25 inch (63.5 cm) [diagonal], high definition screen [viewed] from eight feet away,” and the UI guidelines say to “target a 640×360 resolution” using a default text size of 40 px.

When viewed through the Explorer’s optics, the display appears like a 25 inch, 16:9 screen that is 20 inches (51 cm) wide, viewed from 8 feet (244 cm) away. At that size and distance, each pixel subtends just over 1 arc minute in your visual field (67 arc seconds = 3600 × degrees(2 × atan(20/640/2/(8×12)))). Typical 20/20 vision has a resolution of about 1 arc minute (60 arc seconds), so Glass pixels come very close to qualifying as a “retina” display. The whole screen, however, fills just 14 degrees of your visual field. For reference, an iPhone screen held 11 inches (28 cm) away fills 18 degrees (20.6 degrees for an iPhone 5).
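That arc-second figure is easy to verify; the sketch below just re-runs the formula from the text:

```javascript
// Angle subtended by one Glass pixel: a 20-inch-wide virtual image with
// 640 horizontal pixels, viewed from 8 feet (96 inches).
const degrees = rad => rad * 180 / Math.PI;

const imageWidth = 20;        // inches, from Google's description
const pixelsAcross = 640;     // per the UI guidelines
const viewDistance = 8 * 12;  // inches

const pixelArcsec =
  3600 * degrees(2 * Math.atan(imageWidth / pixelsAcross / 2 / viewDistance));
console.log(Math.round(pixelArcsec));  // → 67 arc seconds per pixel
```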

By the way, the TSA is going to have a field day with sousveillance!

S.H.I.E.L.D’s Helicarrier Could Lift-off Using Today’s Technology

S.H.I.E.L.D Helicarrier from The Avengers movie

Immediately after watching The Avengers movie (on opening day; and yes, it’s been quite a while), I set off, tongue firmly in cheek, to reality-check S.H.I.E.L.D’s Helicarrier. Professor Allain’s Wired blog “Could the S.H.I.E.L.D Helicarrier Fly?” arrives at a different conclusion than I did, so I thought it’s finally time to write up my alternate assessment.

Like the professor, I began by looking into Nimitz class aircraft carriers. Built using HSLA-100 steel, those 333 meter long carriers displace about 100,000 long tons. S.H.I.E.L.D, with flight as a design goal, would clearly upgrade to titanium, employ aircraft construction techniques, and use other advanced methods to lighten their 450 m Helicarrier by about half to 55,000 metric tons (t).

Next, could real fans fit in the space allocated and still generate enough thrust to lift 55 kt? Using the Alpha Jets on deck as measuring sticks shows that each of the four fans is 51 m in diameter, giving a total rotor area of 8,200 m². The Helicarrier name invites comparison with helicopters, so I initially checked its fans against the remarkable Russian Mi-26 heavy-lift helicopter. The Mi-26 has a maximum takeoff weight of 56 t using a rotor 32 m across, for a lift/area ratio of 0.07 t/m². That herculean helicopter’s lift/area ratio provides only 1% of what S.H.I.E.L.D needs to fly. Helicopter rotors are out.

Fortunately, engineers have already done better, much better; the Rolls-Royce LiftSystem in the F-35B produces far higher lift/area ratios. For instance, the front LiftFan generates 20,000 lb of lift from a 127 cm (50 in.) fan and the whole system generates 19 t (41,900 lb) of lift from 2.51 m² of duct area (LiftFan, jet exhaust, and roll-posts). Its 7.56 t/m² ratio leaps two orders of magnitude past the Mi-26 and provides plenty of lift-off thrust for S.H.I.E.L.D’s headquarters. To sustain a single rotor failure, the Helicarrier engineers must have improved on Rolls-Royce by at least 14% to 8.61 t/m² or more. Nonetheless, the Helicarrier appears to fly within the laws of physics here.
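The lift/area arithmetic above is simple enough to check directly; this sketch recomputes the three ratios from the numbers in the text (metric tons and meters throughout):

```javascript
// Lift-per-area sanity check for the three data points in the text.
const area = d => Math.PI * (d / 2) ** 2;  // disc area from diameter

const helicarrierNeed = 55000 / (4 * area(51));  // four 51 m fans
const mi26 = 56 / area(32);                      // Mi-26 main rotor
const liftSystem = 19 / 2.51;                    // F-35B LiftSystem duct area

console.log(helicarrierNeed.toFixed(2));  // ~6.7 t/m² required
console.log(mi26.toFixed(2));             // ~0.07 t/m²
console.log(liftSystem.toFixed(2));       // ~7.57 t/m² — clears the bar
```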

Spinning those fans requires an enormous amount of power and generating it presents an even bigger engineering problem, but it hovers (barely) within the realm of possibility. The F35-B’s thrust ratio (55,000 shp delivering 41,900 lb thrust) implies the Helicarrier carries engine(s) capable of 157 million horsepower (shp) or ~117 gigawatts output (that’s more power than all of the nuclear power plants in the USA combined). Any power source would need to be scaled up, but allocating 20% of the carrier’s gross tonnage to the power plant sets the minimum power density at 10.8 kW/kg; for comparison, here’s a quick rundown of some real-world power densities:

  • A Pratt & Whitney F135 jet turbine in the F35B produces 38 kW/kg
  • The RS-25D Space Shuttle Main Engine delivers the highest power to weight ratio available today at 1445 kW/kg
  • For something closer to home, the LS9 supercharged V8 in a Corvette C6 ZR1 produces 638 HP (476 kW) and weighs 530 lb (240 kg) dry, for a power to weight ratio of 2.0 kW/kg
  • The 2012 IndyCar Series engine weighs just 112.5 kg (248 lb) and puts out 522 kW (700 HP), or 4.64 kW/kg: more than double the Corvette’s specific power, but still less than half of what’s needed
  • A Nimitz A4W nuclear reactor weighs a lot more, but at least it contains enough fuel for two decades; it generates 0.05 kW/kg with older, 18% efficient steam turbines
  • The Los Alamos Heat Pipe reactor (alpha of 0.43) combined with 40%+ efficient turbines raises its specific power to 0.8 kW/kg
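A quick sketch of the power math above — the F-35B shaft-horsepower scaling and the 20%-of-tonnage power density. Small differences from the post’s 157 million shp / 117 GW figures come down to rounding in the unit conversions:

```javascript
// Scale the F-35B's shaft-power-to-thrust ratio up to the Helicarrier and
// derive the minimum power density if 20% of gross tonnage is power plant.
const massTons = 55000;                  // metric tons
const massLb = massTons * 2204.62;       // pounds
const shp = massLb * 55000 / 41900;      // F-35B: 55,000 shp per 41,900 lb thrust
const watts = shp * 745.7;               // 1 shp ≈ 745.7 W

const plantKg = 0.20 * massTons * 1000;  // 20% of gross tonnage for the plant
const kWperKg = watts / 1000 / plantKg;

console.log((shp / 1e6).toFixed(0));     // ~159 million shp
console.log((watts / 1e9).toFixed(0));   // ~119 GW
console.log(kWperKg.toFixed(1));         // ~10.8 kW/kg minimum power density
```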

Nuclear reactors, even lightweight research reactors, won’t work the way they do aboard a Nimitz carrier. S.H.I.E.L.D would need around 530 A4W reactors to generate 117 GW, and those would weigh over 2 million metric tons; that wouldn’t just ground the Helicarrier, it would sink it!

Shuttle main engines would weigh 81 t or 0.15% of the total. Using the rest of the weight allocation for cryogenic fuel provides 2 hours of flight time in a tank 111 m long and 11 m in diameter.

Turbines are the only real-world engines with high enough specific power for the Helicarrier, but they would need to be scaled up enormously; ganging together 3,000 P&W F135 turbines would be an engineering and maintenance nightmare!

For in-world consistency, I also had to consider Tony Stark’s imaginary arc reactors. His Mark III could generate 3 gigawatts (3×10⁹ W) and the Mark IV upgrade quadrupled that to 12 gigawatts (12×10⁹ W). Stark could easily pick up either version by hand, so I’ll assume a maximum weight of 50 kg. That’s 240,000 kW/kg — a specific power more than 160 times greater than the Shuttle turbopump! Using an exotic power source like that obviates the need for anything fancy; just scale up a standard AC induction motor and watch it fly!

Improve VR Resolution Using Subpixel Rendering without Color Filters in Periphery

Visual Field of the Naked Eye, section 3.1 of Howlett, SID 1992.

Wide-angle virtual reality (VR) displays need to wrap around up to 280° horizontally and 120° vertically to cover the human visual field¹; within that area, eye motion can swivel the fovea over a circular region approximately 90° across, both horizontally and vertically. That’s a lot of screen real estate! For comparison, THX recommends sitting just 6.5 feet (2 m) from your new 65-inch HDTV, but that still fills only 40° of view.

VR head mounted displays (HMDs) typically use lenses to magnify the screen so that it fills up more of our visual field. At high magnification though, the pixels get very large and image quality fades. A standard way to improve quality works with our eyes’ natural limits; the farther things are from the center of vision, the less clearly we see them.  Fisheye lens magnifiers work well with this limitation; they stretch the center part of the image less where we see better and magnify the outside areas more where we don’t see as well.

Another way to get more quality out of the image is to work with another limit of our peripheral vision.  Outside of the foveal sweep (aka the 90° direct field where we see sharply) our eyes do not perceive color.  Our brains keep track of color for us in the periphery, but our eyes don’t actually see that color; it’s literally all in our head. Wide angle VR display systems, therefore, can use sub-pixel rendering without applying a low-pass color filter; chromatic aliasing will not be visible for pixel triads in the periphery; this triples horizontal resolution outside of the foveal area for standard LCD stripe arrays.

Sub-pixel rendering without color filtering outside of the direct field can be used either to push more pixels into the direct field or to extend the peripheral field.  HMDs using “large expanse, extra perspective” (LEEP) optics, for example, can increase the fisheye magnification on the edges of the display and further improve the projected visual field. The constant k can be increased from the typical LEEP value of 0.18 to even higher levels of magnification.

LEEP optical compression. r = F (Θ – .18 Θ³) ≈ F sin Θ
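The LEEP compression formula in the caption is easy to play with; this sketch evaluates r = F(θ − kθ³) at a few field angles and shows how closely the typical k = 0.18 curve tracks F sin θ (the Taylor series for sine uses 1/6 ≈ 0.167):

```javascript
// LEEP-style radial compression: image radius r for field angle theta.
// k = 0.18 is the typical LEEP constant from the text; raising k
// compresses the periphery harder.
function leepRadius(theta, focalLength = 1, k = 0.18) {
  return focalLength * (theta - k * theta ** 3);
}

for (const theta of [0.25, 0.5, 0.75, 1.0]) {  // field angle in radians
  console.log(theta, leepRadius(theta).toFixed(3), Math.sin(theta).toFixed(3));
}
```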

If the HMD designer has input into the LCD design, additional resolution improvements can be incorporated by specifying a unique LCD color filter arrangement.  In the direct-field portion of the LCD viewable by the fovea, individual sub-pixels should be colored in a Bayer pattern that better mimics the physiology of the human eye.  Outside of the direct field, the LCD color filters should be eliminated altogether to improve the range of brightness and simplify sub-pixel rendering.


Digital vs. Analog Photography

Film still has much higher resolution, a wider color gamut, and greater dynamic range than digital sensors; however, the convenience, instant feedback, and cost savings offered by digital photos and video will eventually confine analog film to niche uses. Today’s (2012) consumer level digital resolutions already capture images that exceed most people’s visual acuity for small formats (e.g. Apple’s “Retina” displays, 5″x7″ prints), but feature movies are still shot using 35-mm film to safely scale to large theater screens.

Ken Rockwell’s sage take on analog vs. digital in 2002 remains true today:

Convenience has always won out over ultimate quality throughout the history of photography. Huge home-made wet glass plates led to store-bought dry plates which led to 8 x 10″ sheet film which led to 4 x 5″ sheet film which led to 2-1/4″ roll film which led to 35mm which led to digital. As the years roll on the ultimate quality obtained in each smaller medium drops, while the average results obtained by everyone climbs. In 1860 only a few skilled artisans like my great-great-great grandfather in Scotland could coax any sort of an image at all from a plate camera while normal people couldn’t even take photos at all. In 1940 normal people got fuzzy snaps from their Brownies and flashbulbs while artists got incredible results on 8 x 10″ film. Today artists still mess with 4 x 5″ cameras and normal people are getting the best photos they ever have on 3 MP digital cameras printed at the local photo lab.

Most of the “digital vs. film” essays on the internet actually compare digitized scans of film against directly captured digital images and have the implicit goal of justifying a professional photographer’s expensive digital camera purchase. Unfortunately, the scanner usually limits resolution on the film side but rarely receives reviewer attention!

When critiquing articles, watch for comparisons that use a microscope to examine the film, and look for discussions of each workflow’s overall impact on the imaging system. Only recently have lens MTF testing and discussion been revived for digital photography; film buffs in the ’70s and ’80s regularly read optical lab reports comparing lenses. This expansion of the conversation shows that high-end camera sensors have finally achieved the resolutions film had in the ’70s: sensors have reached the point where lens quality once again affects overall image quality. Despite Kodak’s financial difficulties, their research labs have continued improving film, and digital still has ground to cover before it reaches the absolute resolution available on film. 35-mm movies could deliver even higher resolution, if they needed to, by using larger formats; for instance, VistaVision exposes twice as much negative area (8-perf, horizontal frames exposed the same way a 35-mm still camera exposes them). But they don’t need to: other costs and limitations in the workflow matter more.

Color gamut, frame rate, and dynamic range remain problematic for digital imaging too. HDR algorithms and better sensors have only started addressing these problems. Panavision’s John Galt provides some good detail in “The Truth About 2K, 4K and The Future of Pixels” where he advocates for higher frame rates as the quickest way to improve perceived resolution.

My conclusion?  While film is technically superior, none of this really matters for me yet; the creative input of the photographer/director dominates the quality of the result. Even an iPhone, in the hands of an expert photographer, can outperform any camera in the hands of an amateur.  Instead of investing in increasingly higher resolution cameras or reverting to film, I’m heading to the library to improve the equipment between my ears!
