Gadgets

Apple could reportedly add TrueDepth features to the rear iPhone camera

Apple has packed a ton of sensors into the notch of the iPhone X. The TrueDepth camera system powers many features, from Face ID to face-tracking technology using ARKit. It also lets you take selfies in Portrait mode. According to a new Bloomberg report, Apple now wants to improve the rear camera with 3D-sensing capabilities.

Bloomberg says that Apple won’t use the exact same technology in the rear sensor. The iPhone X currently projects a grid of thousands of laser dots and looks at the distortion of those dots on your face.

This rumored 3D sensor would project laser dots and calculate how long it takes for the light to bounce off objects and return to the phone. Apple currently relies on the two rear cameras to estimate which objects are closer to you.

But this new system would be much more accurate. Your phone could understand your surroundings and create a rough 3D map of your environment. It would be quite useful for ARKit and other augmented reality features.
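If you want a sense of the math behind that time-of-flight idea, it boils down to timing the round trip of a light pulse and multiplying by the speed of light, halved because the pulse travels out and back. Here is a minimal sketch with made-up timings; it illustrates the principle only and is not anything from Apple's actual system.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def depth_from_round_trip(round_trip_seconds: float) -> float:
    # The pulse travels to the object and back, so halve the total distance.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after roughly 6.7 nanoseconds maps to about 1 meter.
for t in (6.7e-9, 13.3e-9, 20.0e-9):
    print(f"{t * 1e9:.1f} ns -> {depth_from_round_trip(t):.2f} m")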

Apple is getting serious about augmented reality — Bloomberg also reported that the company has been working on an AR headset. Adding new AR capabilities to the iPhone could be a great way to convince people to buy an AR headset in a few years.

It reminds me of Google’s Project Tango. It never really took off and Google pivoted to ARCore. But the idea behind Project Tango might live on with Apple’s upcoming phones. Sensors and chips could be cheap and small enough for a thin flagship smartphone.

Bloomberg says this new 3D sensor won’t be ready for next year’s iPhone. It could ship in 2019.

Europe

Let’s talk about commercial drones at TechCrunch Disrupt Berlin

Everyone loves drones, including companies with very specific needs. Drones were once the hot new thing and the perfect birthday gift. But drone makers are now realizing that there’s a bigger opportunity in commercial use cases, from farming to inspection. That’s why we’re excited to announce that the founders of three amazing companies in the drone industry will join us on stage at TechCrunch Disrupt Berlin on December 4-5, 2017.

As we already announced, Henri Seydoux from Parrot is going to tell us about his company’s shift. Parrot has been a pioneer in the drone industry. The company took advantage of the accelerometers, gyroscopes, wireless chips and energy-efficient processors that you can find in smartphones to power tiny quadcopters.

But Parrot has also made multiple acquisitions for its commercial drone division. SenseFly, Airinov, MicaSense and Pix4D are now all owned by Parrot. The French company also owns multiple patents on drone technologies and sells integrated software and hardware solutions for firefighters, farmers and more.

Second, James Harrison from Sky-Futures has been building software solutions for drone inspections. The company first focused on the oil and gas industry. When you think about it, it’s so much easier to fly a drone around an offshore oil drilling platform than to send human inspectors to check that everything is working properly.

Other industries also have a hard time keeping an eye on inaccessible sites. If you’re maintaining bridges, keeping an eye on wind turbines or overseeing construction sites, drones can be much cheaper than human inspection. Sky-Futures lets you record, log and share all your observations.

Finally, Clément Christomanos from Uavia has been working on remote aerial inspection. If you’ve ever played with a drone, you know that they have limited battery life and that you need to stay in range.

Uavia thinks this can be an issue if you need to inspect multiple sites around the country. You either need to train people on the ground or send drone pilots. With Uavia, you can control a drone thousands of miles away from your web browser.

Drones receive instructions from traditional cell towers and then go back to charging stations when they’re done. This can replace or supplement surveillance cameras in sensitive areas.

All those use cases are just the tip of the iceberg. Goldman Sachs recently shared a report on commercial drones. There are more than a dozen industries that could greatly benefit from using drones.

Get your Disrupt tickets right now to see the founders of Parrot, Sky-Futures and Uavia tell you everything about the future of drones. You’ll also see the Startup Battlefield competition, in which a handful of startups pitch our judges with the hopes of winning the coveted Disrupt Cup and a cash prize.

You’ll get to chat with plenty of promising startups in Startup Alley, see amazing talks on the main stage, and unwind after a long day at the show with a cocktail and some new friends at the Disrupt after party.

Do you run a startup? The Startup Alley Exhibitor Package is your best bet to get the greatest exposure by exhibiting your company or product directly on the Disrupt Berlin show floor.

Gadgets

Apple’s Thanksgiving ad is mostly about the AirPods

Every year, Apple airs a new ad in the U.S. for Thanksgiving. Compared to other Apple ads, these spots are less about showing off product features and more like a greeting card.

This year is no different — you still see a lot of AirPods. Apple’s new ad is called “Sway” and takes place in the streets of New York. A woman starts playing Sam Smith’s “Palace” on her white iPhone X with her AirPods.

She then ends up in an alternate reality where she can dance around people without getting noticed. She bumps into a man, hands him an AirPod and starts dancing with him under the snow.

Fun fact: these two dancers are married in real life.



Europe

Snips lets you build your own voice assistant to embed into your devices

French startup Snips is now helping you build a custom voice assistant for your device. Snips doesn’t use Amazon’s Alexa Voice Service or Google Assistant SDK — the company is building its own voice assistant so that you can embed it on your devices. And the best part is that it doesn’t send anything to the cloud as it works offline.

If you want to understand how a voice assistant works, you can split it into multiple parts. First, it starts with a wakeword. Snips has a handful of wakewords by default, such as “Hey Snips,” but you can also pay the company to create your own wakeword.

For instance, if you’re building a multimedia robot called Keecker, you can create a custom “Hey Keecker” wakeword. Snips then uses deep learning to accurately detect when someone is trying to talk to your voice assistant.
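To make that first step a bit more concrete, here is a rough sketch of how an on-device wakeword detector typically runs: a small model scores short audio frames in a loop, and the assistant only wakes up after a few consecutive high scores. The record_frame and wakeword_model helpers are hypothetical stand-ins, not Snips’ actual API.

# Generic on-device wakeword loop; a sketch of the idea, not Snips' real API.
THRESHOLD = 0.8          # confidence above which a frame counts as a hit
CONSECUTIVE_FRAMES = 3   # require a few hits in a row to avoid false triggers

def listen_for_wakeword(wakeword_model, record_frame):
    hits = 0
    while True:
        frame = record_frame()               # e.g. 20-30 ms of audio samples
        score = wakeword_model.score(frame)  # hypothetical model, outputs 0..1
        hits = hits + 1 if score > THRESHOLD else 0
        if hits >= CONSECUTIVE_FRAMES:
            return  # hand control over to speech recognition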

The second part is automatic speech recognition. A voice assistant transcribes your voice into a text query. Popular home assistants usually send a small audio file with your voice and use servers to transcribe your query.

Snips can transcribe your voice into text on the device itself. It works on anything that is more powerful than a Raspberry Pi. For now, Snips is limited to English and French. You’ll have to use a third-party automatic speech recognition API for other languages.

Then, Snips needs to understand your query. The company has developed its own natural language understanding capabilities. But there are hundreds, or even thousands, of different ways to ask a simple question about the weather, for instance.

That’s why Snips is launching a data generation service today. I saw a demo yesterday, and the interface looks like Automator on macOS or Workflow on iOS. You define some variables, such as “date” and “location,” specify whether they are mandatory for the query and enter a few examples.

But instead of manually entering hundreds of variations of the same query, you can pay $100 to $800 to let Snips do the work for you. The startup manually checks your request then posts it on Amazon Mechanical Turk and other crowdsourcing marketplaces. Finally, Snips cleans up your data set and sends it back to you.

You can either download it and reuse it in another chatbot or voice assistant, or you can use it with Snips’ own voice assistant. You can also make your capability public. Other Snips users can add this capability to their own assistant by browsing a repository of pre-trained capabilities.

  1. Create Intent

  2. Choose datagen package

  3. Confirm results
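To picture what creating an intent amounts to, here is a hedged sketch of an intent with two variables and a few example queries, the kind of seed data the service then expands into hundreds of crowdsourced variations. The field names and format are hypothetical, purely for illustration; Snips’ real interface is the graphical one described above.

# Hypothetical intent definition; a sketch of the concept, not Snips' real format.
weather_intent = {
    "name": "GetWeather",
    "slots": [
        {"name": "date", "required": False},     # e.g. "tomorrow"
        {"name": "location", "required": True},  # e.g. "Berlin"
    ],
    # A few seed examples; the datagen service expands these into many variations.
    "examples": [
        "what's the weather like in {location} {date}",
        "will it rain in {location} {date}",
        "do I need an umbrella in {location}",
    ],
}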

A Snips voice assistant typically requires hundreds of megabytes but is quite easy to update. After installing the Snips app on your device, you just need to replace a zip library file to add new capabilities.

You also need to implement the actual actions. Snips only translates what someone is saying into a parsable query. For instance, Snips can understand that “could you please turn on the bedroom light?” means “light + bedroom + on.” A developer still needs to implement the action based on those three parameters.
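As a sketch of that last mile, once the query is parsed into “light + bedroom + on,” the developer’s code just maps those parameters to a device action. Everything below, including the set_light helper, is hypothetical and not part of Snips.

# Hypothetical handler for a parsed "turn on the bedroom light" query.
# Snips only produces the structured parameters; the action is up to the developer.
def set_light(room: str, on: bool) -> None:
    # Stand-in for whatever the device actually does (GPIO, smart-home API, etc.).
    print(f"Turning the {room} light {'on' if on else 'off'}")

def handle_light_intent(intent: dict) -> None:
    room = intent["room"]    # e.g. "bedroom"
    state = intent["state"]  # "on" or "off"
    set_light(room, on=(state == "on"))

handle_light_intent({"intent": "setLight", "room": "bedroom", "state": "on"})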

Developers are already playing with Snips to test its capabilities. But the company hopes that big device manufacturers are going to embed Snips into their future products. Eventually, you could imagine a coffee maker with a Snips voice assistant.

Unlike Amazon and Google with their wide-ranging assistants, Snips thinks that you don’t need to embed a complete voice assistant into all your devices. You only want to tell your Roomba to start vacuuming; there’s no need to be able to start a Spotify playlist from your vacuum cleaner.

This approach presents a few advantages when it comes to privacy and network effects. Big tech companies are creating ecosystems of internet-of-things devices. People are buying lightbulbs, security cameras and door locks that work with the Amazon Echo, for instance.

But if you can talk to the devices themselves, you don’t need to hook up your devices with a central home speaker — the central hub disappears. If voice assistants are more than a fad, Snips is building some promising technology. And Snips could get some licensing revenue for each device that comes with its voice assistant.
Featured Image: Bryce Durbin/TechCrunch
