Gadgets – TechCrunch Devin Coldewey

The complex optics involved with putting a screen an inch away from the eye in VR headsets could make for smartglasses that correct for vision problems. These prototype “autofocals” from Stanford researchers use depth sensing and gaze tracking to bring the world into focus when someone lacks the ability to do it on their own.

I talked with lead researcher Nitish Padmanaban at SIGGRAPH in Vancouver, where he and the others on his team were showing off the latest version of the system. It’s meant, he explained, to be a better solution to the problem of presbyopia, which is basically when your eyes refuse to focus on close-up objects. It happens to millions of people as they age, even people with otherwise excellent vision.

There are, of course, bifocals and progressive lenses that bend light in such a way as to bring such objects into focus — purely optical solutions, and cheap as well, but inflexible, and they only provide a small “viewport” through which to view the world. There are adjustable-lens glasses as well, but they must be adjusted slowly and manually with a dial on the side. What if you could make the whole lens change shape automatically, depending on the user’s need, in real time?

That’s what Padmanaban and colleagues Robert Konrad and Gordon Wetzstein are working on, and although the current prototype is obviously far too bulky and limited for actual deployment, the concept seems totally sound.

Padmanaban previously worked in VR, and mentioned what’s called the convergence-accommodation problem. Basically, the way that we see changes in real life when we move and refocus our eyes from far to near doesn’t happen properly (if at all) in VR, and that can produce pain and nausea. Having lenses that automatically adjust based on where you’re looking would be useful there — and indeed some VR developers were showing off just that only 10 feet away. But it could also apply to people who are unable to focus on nearby objects in the real world, Padmanaban thought.

This is an old prototype, but you get the idea.

It works like this. A depth sensor on the glasses collects a basic view of the scene in front of the person: a newspaper is 14 inches away, a table three feet away, the rest of the room considerably more. Then an eye-tracking system checks where the user is currently looking and cross-references that with the depth map.

Having been equipped with the specifics of the user’s vision problem, for instance that they have trouble focusing on objects closer than 20 inches away, the apparatus can then make an intelligent decision as to whether and how to adjust the lenses of the glasses.

In the case above, if the user is looking at the table or the rest of the room, the glasses will assume whatever normal correction the person requires to see — perhaps none. But if they change their gaze to focus on the paper, the glasses immediately adjust the lenses (perhaps independently per eye) to bring that object into focus in a way that doesn’t strain the person’s eyes.

The whole process — checking the gaze, finding the depth of the selected object and adjusting the lenses — takes a total of about 150 milliseconds. That’s long enough that the user might notice it happening, but the act of redirecting and refocusing one’s gaze takes perhaps three or four times that long, so the changes in the device will be complete by the time the user’s eyes would normally be at rest again.
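To make that sequence concrete, here is a minimal sketch of the gaze-to-lens decision loop. It is purely illustrative — the sensor interfaces, the 20-inch near limit and the lens-control hook are assumptions, not the researchers’ actual code.

```python
# Illustrative sketch of the autofocal loop: gaze -> depth lookup -> lens power.
# depth_map.distance_at() and set_lens_power() are hypothetical stand-ins for
# the headset's depth-sensing and tunable-lens interfaces.

NEAR_LIMIT_M = 0.5  # assumed: the wearer can't focus closer than ~20 inches


def diopters_needed(distance_m, near_limit_m=NEAR_LIMIT_M):
    """Extra optical power (in diopters) to focus an object at distance_m.

    Objects beyond the wearer's near limit need no help; closer objects need
    the difference in vergence, 1/d_object - 1/d_near_limit.
    """
    if distance_m >= near_limit_m:
        return 0.0
    return 1.0 / distance_m - 1.0 / near_limit_m


def update_lenses(depth_map, gaze_xy, set_lens_power):
    """One ~150 ms iteration: where is the user looking, and how far away is it?"""
    distance_m = depth_map.distance_at(gaze_xy)    # cross-reference gaze with depth map
    set_lens_power(diopters_needed(distance_m))    # drive the adaptive lens (per eye)
```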

“Even with an early prototype, the Autofocals are comparable to and sometimes better than traditional correction,” reads a short summary of the research published for SIGGRAPH. “Furthermore, the ‘natural’ operation of the Autofocals makes them usable on first wear.”

The team is currently conducting tests to measure more quantitatively the improvements derived from this system, and to check for any possible ill effects, glitches or other complaints. They’re a long way from commercialization, but Padmanaban suggested that some manufacturers are already looking into this type of method, and that despite its early stage it’s highly promising. We can expect to hear more from them when the full paper is published.

Gadgets – TechCrunch Natasha Lomas

Elvie, a femtech hardware startup whose first product is a sleek smart pelvic floor exerciser, has inked a strategic partnership with the UK’s National Health Service that will make the device available nationwide through the country’s free-at-the-point-of-use healthcare service, at no direct cost to the patient.

It’s a major win for the startup that was co-founded in 2013 by CEO Tania Boler and Jawbone founder, Alexander Asseily, with the aim of building smart technology that focuses on women’s issues — an overlooked and underserved category in the gadget space.

Boler’s background before starting Elvie (née Chiaro) included working for the U.N. on global sex education curriculums. But her interest in pelvic floor health, and the inspiration for starting Elvie, began after she had a baby herself and found there was more support for women in France than in the U.K. when it came to taking care of their bodies after giving birth.

With the NHS partnership, which is the startup’s first national reimbursement partnership (and therefore, as a spokeswoman puts it, has “the potential to be transformative” for the still young company), Elvie is emphasizing the opportunity for its connected tech to help reduce symptoms of urinary incontinence, including those suffered by new mums or in cases of stress-related urinary incontinence.

The Elvie kegel trainer is designed to make pelvic floor exercising fun and easy for women, with real-time feedback delivered via an app that also gamifies the activity, guiding users through exercises intended to strengthen their pelvic floor and thus help reduce urinary incontinence symptoms. The device can also alert users when they are contracting incorrectly.

Elvie cites research suggesting the NHS spends £233M annually on incontinence, claiming also that around a third of women and up to 70% of expectant and new mums currently suffer from urinary incontinence. It also suggests that in 70% of stress urinary incontinence cases, symptoms can be reduced or eliminated via pelvic floor muscle training.

And while there’s no absolute need for any device to perform the muscle contractions that strengthen the pelvic floor, the challenge the Elvie Trainer is intended to address is that it can be difficult for women to know whether they are performing the exercises correctly or effectively.

Elvie cites a 2004 study suggesting that around a third of women can’t exercise their pelvic floor correctly with written or verbal instruction alone. Biofeedback devices (generally, rather than the Elvie Trainer specifically), it says, have been proven to increase the success rate of pelvic floor training programmes by 10% — which it says other studies have suggested can lower surgery rates by 50% and reduce treatment costs by £424 per patient within the first year.

“Until now, biofeedback pelvic floor training devices have only been available through the NHS for at-home use on loan from the patient’s hospital, with patient allocation dependent upon demand. Elvie Trainer will be the first at-home biofeedback device available on the NHS for patients to keep, which will support long-term motivation,” it adds.

Commenting in a statement, Clare Pacey, a specialist women’s health physiotherapist at Kings College Hospital, said: “I am delighted that Elvie Trainer is now available via the NHS. Apart from the fact that it is a sleek, discreet and beautiful product, the app is simple to use and immediate visual feedback directly to your phone screen can be extremely rewarding and motivating. It helps to make pelvic floor rehabilitation fun, which is essential in order to be maintained.”

Elvie is not disclosing commercial details of the NHS partnership but a spokeswoman told us the main objective for this strategic partnership is to broaden access to Elvie Trainer, adding: “The wholesale pricing reflects that.”

Discussing the structure of the supply arrangement, she said Elvie is working with Eurosurgical as its delivery partner — a distributor she said has “decades of experience supplying products to the NHS”.

“The approach will vary by Trust, regarding whether a unit is ordered for a particular patient or whether a small stock will be held so a unit may be provided to a patient within the session in which the need is established. This process will be monitored and reviewed to determine the most efficient and economic distribution method for the NHS Supply Chain,” she added.

Gadgets – TechCrunch Natasha Lomas

It’s been a long and trip-filled wait but mixed reality headgear maker Magic Leap will finally, finally be shipping its first piece of hardware this summer.

We were still waiting on the price tag — but it’s just been officially revealed: the developer-focused Magic Leap One ‘creator edition’ headset will set you back at least $2,295. That’s a considerable chunk of change — though this bit of kit is not intended as a mass-market consumer device, but as an AR headset for developers to create content that could excite future consumers.

The augmented reality startup, which has raised at least $2.3 billion, according to Crunchbase, attracting a string of high profile investors including Google, Alibaba, Andreessen Horowitz and others, is only offering its first piece of reality bending eyewear to “creators in cities across the contiguous U.S.”.

Potential buyers are asked to input their zip code via its website to check whether it will agree to take their money, though it adds that “the list is growing daily”.

We tried the TC SF office zip and — unsurprisingly — got an affirmative of delivery there. But any folks in, for example, Hawaii wanting to spend big to space out are out of luck for now…

Magic Leap specifies it will “hand deliver” the package to buyers — and “personally get you set up”.

So evidently it wants to try to make sure its first flush of expensive hardware doesn’t get sucked down the toilet of dashed developer expectations.

It describes the computing paradigm it’s seeking to shift, with the help of enthused developers and content creators, as “spatial computing” — but it really needs a whole crowd of technical and creative people moving in step with it if it’s going to successfully deliver on that.

 

Gadgets – TechCrunch Matt Burns

Fitbit’s stock price jumped in after-hours trading and is currently trading around $6.00 a share, off its 52-week intraday high of $7.79.

The company today announced its latest quarterly numbers, which saw the average selling price of its wearables increase 6 percent year-over-year to $106 a device. New devices introduced within the last year represented 59 percent of the company’s revenue.

Smartwatches were a high point for Fitbit this quarter. The company stated that its higher-priced smartwatch wearables outsold Samsung, Garmin and Fossil smartwatches combined in North America. Smartwatch revenue grew to 55 percent of total revenue, up from 30 percent in the previous quarter.

“Our performance in Q2 represents the sixth consecutive quarter that we have delivered on our financial commitments, made important progress in transforming our business, and continued to adapt to the changing wearables market. Demand for Versa, our first ‘mass-appeal’ smartwatch, is very strong. Within the second quarter, Versa outsold Samsung, Garmin and Fossil smartwatches combined in North America, improving our position with retailers, solidifying shelf space for the Fitbit brand and providing a halo effect to our other product offerings,” said James Park, co-founder and CEO.

Fitbit’s stock price rallied earlier this summer, hitting $7.79 — its highest price since early 2017. The stock has been slipping since, though this quarterly release could cause the price to jump again.

Gadgets – TechCrunch Greg Kumparak

Did you know Segway is making a pair of self-balancing roller shoes? It is! The company has been tinkering with all sorts of new form factors since it was acquired by Ninebot in 2015, from half-sized Segways to kick scooters. Next up: inline… shoe… platform things.

Called the Segway Drift W1s, they sorta look like what would happen if you took a hoverboard (as in the trendy 2016 hoverboard-that-doesn’t-actually-hover “hover”board, not Marty McFly’s hoverboard), split it in two and plopped one half under each foot.

It released a video demonstrating the shoes a few weeks back. Just watching it makes me feel like I’ve bruised my tailbone, because I’m clumsy as hell.

Pricing and availability was kept under wraps at the time, but the company has just released the details: a pair will cost you $399, and ship sometime in August. Oh, and they’ll come with a free helmet, because you’ll probably want to wear a helmet.

A new product page also sheds some light on a few other previously undisclosed details: each unit will weigh about 7.7lbs, and top out at 7.5 miles per hour. Riding time “depends on riding style and terrain,” but the company estimates about 45 minutes of riding per charge.

I look forward to trying these — then realizing I have absolutely no idea how to jump off and just riding forever into the sunset.

Gadgets – TechCrunch Matt Burns

Welcome to Bag Week 2018. Every year your faithful friends at TechCrunch spend an entire week looking at bags. Why? Because bags — often ignored but full of our important electronics — are the outward representations of our techie styles, and we put far too little thought into where we keep our most prized possessions.

The Osprey Momentum 32 impresses. I used it during a muddy week at Beaumont Scout Reservation and it performed flawlessly as a rugged, bike-ready backpack. It stood tall in the miserable rain and insufferable heat that engulfed northern Ohio during the camping trip. If it can withstand these conditions, it can withstand an urban commute.

For those following along, Bag Week 2018 ended a week ago. That’s okay. Consider this bonus content. Before publishing a review of this bag, I wanted to test it on a camping trip, and last week’s outing provided a great testing ground.

Osprey markets the Momentum 32 as an everyday pack with a tilt toward bicyclists. There’s a clip on the outside to hold a bike helmet and a large pocket at the bottom to store bicycle shoes — or just another pair of shoes. The back panel features great ventilation and the shoulder straps have extra give to them thanks to integrated elastic bands.

It’s the ventilated back panel that makes the pack stand out to me. It’s ventilated to an extreme. Look at me. I’m in my mid-thirties and on a quest to visit all of Michigan’s craft breweries. I sweat and it was hot during my time with this bag. This bag went a long way in helping to keep the sweat under control — much more so than any other commuter bag I’ve used.

There was never a time when I was using this bag that I felt like a sweaty dad, even though the temp reached into the 90s. I appreciate that.

The internal storage is sufficient. There are plenty of pockets for gadgets and documents. There’s even a large pocket at the bottom to store a pair of shoes and keep them separated from the rest of the bag’s contents. Like any good commuter bag, it has a key chain on a retractable cord so you can get to your keys without detaching them from the bag.

The bag also has a rain cover, which saved me in several surprise rain showers. The rain cover itself is nothing special; a lot of bags have similar covers. This cover is just part of a winning formula used on this bag.

The Osprey Momentum is a fantastic bag. It stands apart from other bags with its extreme ventilation on the back panel and features cyclists and commuters will appreciate.

Gadgets – TechCrunch Devin Coldewey

Devices like smartphones ought to help people feel safer, but if you’re in real danger the last thing you want to do is pull out your phone, go to your recent contacts and type out a message asking a friend for help. The Women’s Safety XPRIZE just awarded its $1 million prize to one of dozens of companies attempting to make a safety wearable that’s simple and affordable.

The official challenge was to create a device costing less than $40 that can “autonomously and inconspicuously trigger an emergency alert while transmitting information to a network of community responders, all within 90 seconds.”

Anu and Naveen Jain, the entrepreneurs who funded the competition, emphasized the international and very present danger of sexual assault in particular.

“Women’s safety is not just a third world problem; we face it every day in our own country and on our college campuses,” said Naveen Jain in the press release announcing the winner. “It’s not a red state problem or a blue state problem but a national problem.”

“Safety is a fundamental human right and shouldn’t be considered a luxury for women. It is the foundation in achieving gender equality,” added Anu Jain.

Out of dozens of teams that entered, five finalists were chosen in April: Artemis, Leaf Wearables, Nimb & SafeTrek, Saffron and Soterra. All had some variation on a device that either detected or was manually activated during an attack or stressful situation, alerting friends to one’s location.

The winner was Leaf, which had the advantage of having already shipped a product along these lines, the Safer pendant. Like any other Bluetooth accessory, it keeps in touch with your smartphone wirelessly and when you press the button twice your emergency contacts are alerted to your location and need for help. It also records audio, possibly providing evidence later or a deterrent to harassers who might fear being identified.
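The interaction model is deliberately simple, and a rough sketch of it fits in a few lines. Everything here is illustrative — the double-press window and the messaging and audio hooks are assumptions, not Leaf’s actual firmware or app code.

```python
import time

DOUBLE_PRESS_WINDOW_S = 0.6  # assumed: two presses this close together mean "help"


class AlertTrigger:
    """Fires an emergency alert (and starts recording) on a double button press."""

    def __init__(self, send_alert, start_recording):
        self.send_alert = send_alert            # hypothetical hook: notify emergency contacts
        self.start_recording = start_recording  # hypothetical hook: begin capturing audio
        self._last_press = None

    def on_button_press(self, location):
        now = time.monotonic()
        if self._last_press is not None and now - self._last_press < DOUBLE_PRESS_WINDOW_S:
            # Second press inside the window: share location and start recording.
            self.send_alert("Emergency alert", location)
            self.start_recording()
            self._last_press = None
        else:
            # First press: remember it and wait for a possible second press.
            self._last_press = now
```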

It’s not that it’s an original idea — we’ve had various versions of this for some time, and even covered one of the other finalists last year. But they haven’t been quantitatively evaluated or given a platform like this.

“These devices were tested in many conditions by the judges to ensure that they will work in real-life cases where women face dangers today. They were tested in no-connectivity areas, on public transit, in basements of buildings, among other environments,” explained Anu Jain to TechCrunch. “Having the capability to record audio after sending the alert was one of the main differentiators for Leaf Wearables. Their chip design and software was also easy to be integrated into other accessories.”

Hopefully the million dollars and the visibility from winning the prize will help Leaf get its product out to people who need it. The runners-up don’t seem likely to give up on the problem, either. And it seems like the devices will only get better and cheaper — not that this will change the world on its own.

“Prices will come down as the sensor prices drop. In many countries it will require community support to be built,” continued Jain. “These technologies can act as a deterrent but in the long term the culture of violence against women must change.”

Gadgets – TechCrunch Devin Coldewey

Microsoft’s HoloLens has an impressive ability to quickly sense its surroundings, but limiting it to displaying emails or game characters on them would show a lack of creativity. New research shows that it works quite well as a visual prosthesis for the vision impaired, not relaying actual visual data but guiding them in real time with audio cues and instructions.

The researchers, from Caltech and the University of Southern California, first argue that restoring vision is at present simply not a realistic goal, but that replacing the perception portion of vision isn’t necessary to replicate the practical portion. After all, if you can tell where a chair is, you don’t need to see it to avoid it, right?

Crunching visual data and producing a map of high-level features like walls, obstacles and doors is one of the core capabilities of the HoloLens, so the team decided to let it do its thing and recreate the environment for the user from these extracted features.

They designed the system around sound, naturally. Every major object and feature can tell the user where it is, either via voice or sound. Walls, for instance, hiss (presumably a white noise, not a snake hiss) as the user approaches them. And the user can scan the scene, with objects announcing themselves from left to right, each from the direction in which it is located. A single object can be selected and will repeat its callout to help the user find it.
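Here is a minimal sketch of what that scanning behavior might boil down to. It is a guess at the structure rather than the researchers’ code — the spatial-audio `speak` callback and the object list are assumptions.

```python
def scan_scene(objects, speak):
    """Announce detected objects in order from the user's left to their right.

    Each object is assumed to carry a label and a horizontal angle relative to
    the user's heading (negative = left, positive = right).
    """
    for obj in sorted(objects, key=lambda o: o["angle"]):
        # The spatialized callout comes from the object's actual direction.
        speak(obj["label"], direction=obj["angle"])


def wall_hiss_volume(distance_m, audible_range_m=3.0):
    """White-noise 'hiss' for a wall: silent beyond range, louder as it gets closer."""
    return max(0.0, 1.0 - distance_m / audible_range_m)


# Example: a chair slightly to the left and a door to the right.
scan_scene(
    [{"label": "door", "angle": 0.8}, {"label": "chair", "angle": -0.3}],
    speak=lambda label, direction: print(f"{label} at {direction:+.1f} rad"),
)
```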

That’s all well for stationary tasks like finding your cane or the couch in a friend’s house. But the system also works in motion.

The team recruited seven blind people to test it out. They were given a brief intro but no training, and then asked to accomplish a variety of tasks. The users could reliably locate and point to objects from audio cues, and were able to find a chair in a room in a fraction of the time it would normally take them, as well as avoid obstacles easily.

This render shows the actual paths taken by the users in the navigation tests.

Then they were tasked with navigating from the entrance of a building to a room on the second floor by following the headset’s instructions. A “virtual guide” repeatedly said “follow me” from an apparent distance of a few feet ahead, while also warning when stairs were coming, where handrails were and when the user had gone off course.
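A sketch of that guide logic, under the same caveat — the path and speech interfaces here are hypothetical stand-ins, not what the paper implements:

```python
GUIDE_LEAD_M = 1.0        # the guide's voice stays roughly a few feet ahead
OFF_COURSE_LIMIT_M = 1.5  # assumed threshold before warning the user


def guide_step(user_position, path, speak):
    """One update of the virtual guide: lead the user, flag stairs, flag wandering."""
    target = path.point_ahead_of(user_position, GUIDE_LEAD_M)
    speak("follow me", from_position=target)

    if path.upcoming_feature(user_position) == "stairs":
        speak("stairs ahead", from_position=target)
    if path.distance_from(user_position) > OFF_COURSE_LIMIT_M:
        speak("you've gone off course", from_position=target)
```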

All seven users got to their destinations on the first try, and much more quickly than if they had had to proceed normally with no navigation. One subject, the paper notes, said “That was fun! When can I get one?”

Microsoft actually looked into something like this years ago, but the hardware just wasn’t there — HoloLens changes that. Even though it is clearly intended for use by sighted people, its capabilities naturally fill the requirements for a visual prosthesis like the one described here.

Interestingly, the researchers point out that this type of system was also predicted more than 30 years ago, long before it was even close to possible:

“I strongly believe that we should take a more sophisticated approach, utilizing the power of artificial intelligence for processing large amounts of detailed visual information in order to substitute for the missing functions of the eye and much of the visual pre-processing performed by the brain,” wrote the clearly far-sighted C.C. Collins way back in 1985.

The potential for a system like this is huge, but this is just a prototype. As systems like HoloLens get lighter and more powerful, they’ll go from lab-bound oddities to everyday items — one can imagine the front desk at a hotel or mall stocking a few to give to vision-impaired folks who need to find their room or a certain store.

“By this point we expect that the reader already has proposals in mind for enhancing the cognitive prosthesis,” they write. “A hardware/software platform is now available to rapidly implement those ideas and test them with human subjects. We hope that this will inspire developments to enhance perception for both blind and sighted people, using augmented auditory reality to communicate things that we cannot see.”

Gadgets – TechCrunch John Biggs

In my endless quest to get geeks interested in watches I present to you the Bell & Ross BR V2-93 GMT 24H, a new GMT watch from one of my favorite manufacturers that is a great departure from the company’s traditional designs.

The watch is a 41mm round GMT, which means it has three hands to show the time on the usual 12-hour scale and a separate fourth hand that shows the time on a 24-hour scale. You can use it to read the time in two or even three time zones, and it comes in a nice satin-brushed metal case with a rubber or metal strap.
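If you’ve never read a GMT complication, the arithmetic behind that fourth hand is simple; here is a worked sketch (nothing specific to the BR V2-93, and the New York/London example is just for illustration):

```python
def gmt_hand_angle(hour_24, minute):
    """Angle of the 24-hour hand in degrees: one rotation per day, 15 degrees per hour."""
    return ((hour_24 % 24) + minute / 60.0) * 15.0


def second_zone_hour(local_hour_24, offset_hours):
    """Hour the 24-hour hand shows for a zone offset from local time."""
    return (local_hour_24 + offset_hours) % 24


# Example: 3:30 pm in New York, with the 24-hour hand tracking London (+5 hours).
print(second_zone_hour(15, 5))   # 20 -> the hand points at 20:30 on the scale
print(gmt_hand_angle(20, 30))    # 307.5 degrees past midnight on the 24-hour track
```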

B&R is unique because it’s one of the first companies to embrace online sales after selling primarily in watch stores for about a decade. This means the watches are slightly cheaper — this one is $3,500 — and jewelers can’t really jack up the prices in stores. Further, B&R has a great legacy of making legible, usable watches, and this one is no exception. It is also a fascinating addition to the line. B&R has an Instrument series, which consists of large, square watches with huge numerals, and a Vintage series that hearkens back to WWII-inspired, smaller watches. This one sits firmly in the middle, taking on the clear lines of the Instrument inside a more vintage case.

Ultimately watches like this one are nice tool watches — designed for legibility and usability above fashion. It’s a nice addition to the line and looks like something a proper geek could wear in lieu of Apple Watches and other nerd jewelry. Here’s hoping.