During the Google I/O keynote address today in San Francisco, Google said it has been busy building deep neural networks that learn and help the company improve its search engine. The company's neural nets are 30 layers deep, with each layer recognizing progressively more complicated features, such as specific shapes and colors.
The work is already paying off: Google says its word error rate has dropped from 23% to just 8% in one year. The neural nets also power other services. For example, the Inbox email app can pull in information based on an upcoming trip to London, and Google Now can tell users when to leave early for work based on traffic patterns.
Stay tuned as we will continue to post updates from today's Google I/O keynote event.