Wednesday, 31 January, 2018. 17:36
On the train ride home again from the conference. I woke up early again, got up and had breakfast, then left to catch the train. M.’s plan for the day was to catch the ferry to Sausalito.
Today instead of walking along the road to the conference, I walked out to the pedestrian path that runs along the shore of the bay. This is a much nicer walk, especially in the morning light, as there are hundreds of shorebirds and an expanse of misty water. I saw ducks (mallards and possibly another species), Pacific gulls, curlews, willets, grebes, and a huge white egret.
Birds on the Bay in the morning
The talks today began with a keynote by Marc Levoy of Google, who talked about pushing the boundaries of low-light imaging with phone cameras. Phones don’t have large apertures, optical zoom, or big sensors, but they do have good low-noise sensors. He showed how you can exploit this to produce images in really low light by shooting bursts of underexposed frames, aligning them, and stacking them to reduce noise. He uses a cool trick to remove hot pixels: acquiring images during the autofocus sweep makes everything blurry… except the hot pixels, so they are easily detected. Alignment is done on very large image patches, hundreds of pixels square, by brute-force cross-correlation that can be computed in real time. This allows frames to be stacked to produce clear real-time video at about 20 frames per second at moonlight light levels. At even lower light levels, his app can produce usable photos in light too dark to even walk in, around 0.1 lux.
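The align-and-stack idea from the talk can be sketched in a few lines of NumPy: align each noisy frame to a reference by brute-force integer-shift cross-correlation, then average. Everything here (frame size, search radius, noise level, the synthetic scene) is my own toy illustration, not figures or code from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def best_shift(ref, frame, radius=4):
    """Brute-force alignment: try every integer (dy, dx) shift within
    +/-radius and keep the one with the highest cross-correlation."""
    best, best_dy, best_dx = -np.inf, 0, 0
    r = radius
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            shifted = np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
            # Score only the interior so wrapped borders don't bias it.
            score = np.sum(ref[r:-r, r:-r] * shifted[r:-r, r:-r])
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx

def align_and_stack(frames):
    """Align each frame to the first, then average to reduce noise."""
    ref = frames[0]
    aligned = [ref]
    for f in frames[1:]:
        dy, dx = best_shift(ref, f)
        aligned.append(np.roll(np.roll(f, dy, axis=0), dx, axis=1))
    return np.mean(aligned, axis=0)

# Synthetic burst: one dim scene, each frame jittered and noisy.
scene = np.zeros((64, 64))
scene[24:44, 24:44] = 1.0  # a bright square
frames = []
for _ in range(8):
    dy, dx = rng.integers(-3, 4, size=2)
    f = np.roll(np.roll(scene, dy, axis=0), dx, axis=1)
    frames.append(f + rng.normal(0, 0.3, scene.shape))

result = align_and_stack(frames)
# Averaging N aligned frames cuts noise std by roughly sqrt(N);
# compare a flat corner region of one raw frame vs the stack.
print(np.std(frames[0][:12, :12]), "->", np.std(result[:12, :12]))
```

A real implementation would align hundreds-of-pixels-square patches rather than whole frames, and would use a far faster correlation search, but the noise-averaging payoff is the same.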
Monkey at the conference
Next were a couple of talks on HDR photography using image stacks, and solving the ghosting problem. The most promising approach appears to be machine learning, which was used by one speaker to correct artefacts produced by optical flow image alignment. Then there was a talk by a guy from DxOMark about developing quantitative measures for the quality of computational bokeh, which would have been of interest to my own work a few years ago.
After the coffee break was another joint session with Image Quality, with talks on measuring noise using the dead leaves test pattern, which characterises noise in textured areas, and on measuring the quality of auto white balance. Then Dietmar gave a talk analysing the EXIF data of 8 million photos taken in 2017, which showed some interesting trends compared to a similar analysis he did back in 2008. In particular, a lot more photos are being taken in low light without flash these days.
At lunch, I walked to Burlingame to buy some fresh fruit. I got an apple, orange, and peach. I ate the apple as I walked back to the shore of the bay to find a nice place to sit and eat the other fruit. I found some seats in the shade of trees, facing out to the bay and some groups of nearby birds. I saw something which I didn’t recognise, a bit grebe-like, but with a small head and sharp beak, swimming on the water.
Turns out it was a western grebe, but this was the closest photo I could get with my phone
20:15
After eating the orange and peach, which were both nice, I walked back towards and then past the conference hotel, to go down the next street to the It’s-It Ice Cream factory shop that I’d discovered last time I was here. I bought a pumpkin ice cream sandwich for just $1.50. There was another guy there before me, ordering a dozen or so of the sandwiches. The man behind the counter asked if he was travelling far with them, presumably to see if he needed dry ice or something, but the guy said he was taking them home, just a few minutes away. The pumpkin ice cream in mine tasted nice, sweet in a vaguely pumpkiny way, but I don’t think I would have identified it by taste.
Pumpkin ice cream sandwich
Today’s post-lunch plenary was Ronald Azuma talking about augmented reality. He is the guy who coined the term “augmented reality” and wrote the very first paper on the subject. He gave an overview of the technology required and the problems that remain to be solved, and then said that the biggest problem was finding an application that would convince mass market consumers to want it and use it. Otherwise it would remain a niche technology for fields like industry and medicine. He said the main issue was convincing people to want to wear the required headgear. He pointed out that people wear glasses or goggles now when it helps them do a thing they want to do, such as skiing, or swimming, or welding. So if there’s a use for which augmented reality helps people do something they want, they will adopt it. But nobody yet knows what use will drive it to a market wider than gaming.
The final session of the day was a panel discussion on immersive imaging, featuring Brian Cabral, who developed Facebook 360, William Jiang, the guy who developed Lytro Immerge, David Cardinal, photographer, and Timothy Macmillan, the guy who developed timeslice (“bullet time”) photography. They discussed the future of immersive imaging and how they see the field developing. The main issue is that the technology exists in a raw, new state, but nobody knows how to use it yet in a compelling way. It’s like the early years of television, when television shows were all rubbish because they were simply copies of radio plays or stage plays transferred to a new medium. It took a decade for people to understand how to make TV shows and for audiences to understand them. We need to develop a “language” of immersive content, in the same way that filmmakers use a visual language that audiences understand because they are exposed to it all the time. And again, it will probably take on the order of a decade for the public to become aware of it.
Back in San Francisco
At the end of the day, I walked back to Millbrae station for the train. Meeting M. in the hotel room shortly after 18:00, we decided to go for dinner soon. We went to King of Thai Noodle House on O’Farrell Street, where we quickly got a table and were served. M. got a vege and tofu stir fry with cashew nuts, chilli, and basil. I wanted the deep fried salmon red curry, but the guy said “duck?”, and I said, “no, salmon”, and he nodded and walked away. And then I ended up getting the duck. We also got a serve of egg rolls as a starter. It was all good, but M. found her dish hotter than she expected. We both ate quickly, because we were hungry. M. had only had a couple of bagels during the day.
Duck curry at King of Thai Noodle House
After dinner, we went back to Hotel Zetta for another drink. M. got a Cabernet sauvignon, while I tried the “Trees Knees” cocktail, made with gin, herb infused honey, and lemon. We checked the bar menu for anything sweet, and M. found a thing saying “cookies, ask for today’s flavours”. We inquired, and Ursula said there was choc chip, raisin, peanut butter, and oatmeal. One serve was four cookies, and we could mix and match. So I asked for a sampler of all four, and she said it would be about half an hour, as they are cooked to order. This was fine since we weren’t in any hurry. When they arrived they were warm and soft out of the oven, very nice.
The bar at Hotel Zetta
After finishing our drinks and the cookies we returned next door to our own hotel and showered and read before going to sleep.