How to Launch and Maintain Enterprise AI Products

There is a significant disconnect between the perception and reality of how enterprise AI products are built.

The narrative seems to be this: given a business problem, the data science team sets about gathering a training dataset (for supervised problems) that reflects the desired output; once the dataset has been built, the team switches to the modelling phase, experimenting with various networks; and at the end, the network with the best metrics is deployed to production … job done, give it enough time and watch the money roll in.

It’s Never That Simple

My experience building enterprise B2B products has been that this is only where the work begins. Freshly minted models rarely satisfy customers’ needs out of the box, for two common reasons.

Click here to read the rest of this article on the Semantics3 blog

Deriving Meaning through Machine Learning: The Next Chapter in Retail

Three slides from Benedict Evans’ brilliant talk, The End of the Beginning, really caught my attention.

The Old and the New

Across industries, machine learning is helping us reach successive levels of meaning about what a thing is. It helps us explicitly understand what things are, a leap forward from the existing state of affairs, which relies on extrapolation through indirect inference. In the context of retail, the implications are significant. Here’s why.

Click here to read the rest of this article on the Semantics3 blog

Product Matching — A Visual Tribute

Product matching is a challenging data-science problem that we’ve been battling for several years at Semantics3. The variety of concepts and nuances that must be taken into consideration to tame this problem has reduced our data scientists to tears on more than one occasion.

In this week’s post, we decided to pay a visual tribute to product matching by showcasing some of the particularly difficult examples that we’ve come across over the years. Enjoy!

Click here to read the rest of this article on Medium

The GIGO Principle in Machine Learning

And its implications for PMs, designers, salespeople and data scientists

Garbage-In-Garbage-Out is the idea that the output of an algorithm, or any computer function for that matter, is only as good as the quality of the input that it receives.

The principle underlying GIGO is essential when it comes to the real-world deployment of algorithms. And with the increasing use of ML in everything from public-facing APIs to the underlying services that power public-facing applications, awareness and assimilation of this principle are as important now as they have ever been.
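A minimal sketch of the principle in action, using a toy 1-nearest-neighbour classifier (the data, separation, and 40% corruption rate are all invented for illustration): train it once on clean labels and once on the same features with corrupted labels, and the quality of the output tracks the quality of the input.

```python
import random

def nearest_neighbour_predict(train, x):
    """Return the label of the training point closest to x (1-NN)."""
    return min(train, key=lambda pt: abs(pt[0] - x))[1]

def accuracy(train, test):
    return sum(nearest_neighbour_predict(train, x) == y for x, y in test) / len(test)

random.seed(0)
# Two well-separated classes: features near 0.0 carry label 0, near 10.0 label 1.
make_point = lambda label: (random.gauss(10.0 * label, 1.0), label)
train = [make_point(i % 2) for i in range(200)]
test = [make_point(i % 2) for i in range(200)]

# Garbage in: flip roughly 40% of the training labels.
noisy_train = [(x, 1 - y) if random.random() < 0.4 else (x, y) for x, y in train]

clean_acc = accuracy(train, test)
noisy_acc = accuracy(noisy_train, test)
print(f"clean labels: {clean_acc:.2f}, corrupted labels: {noisy_acc:.2f}")
```

Garbage out: a memorising model faithfully reproduces the label noise it was fed, and no amount of tuning downstream recovers the lost signal.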


Click here to read the rest of this article on Medium

“Hot Dog and a Not Hot Dog”: The Distinction Matters (Code Included)

And Why Periscope Should’ve Held Out for a Little Longer

Spoiler Alert: This article references a recent episode of the show Silicon Valley. It only refers to material already provided in HBO-released previews, but if you’d like to stay completely out of the know, look away now.

In a recent episode of HBO’s “Silicon Valley”, one of the characters, Jian-Yang, builds an app called “Not Hotdog”. The app lets users identify whether objects are or are not hot dogs. At face value, it seems to be of little use, but it turns out to have very interesting wider applicability (watch the episode to find out more).

One of the comedic quirks is that Jian-Yang insists that the app enables two different tasks:

  1. Identifies whether an object is a hot dog.
  2. Identifies whether an object is not a hot dog.
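One way to take the joke seriously: with a single yes/no cutoff on a classifier’s score, the two “tasks” are exact complements; but once the model is allowed to abstain when unsure, “confidently a hot dog” and “confidently not a hot dog” stop being mirror images. A toy sketch (the scores and thresholds here are invented for illustration, not taken from the show’s app):

```python
def classify(score, lo=0.2, hi=0.8):
    """Three-way decision from P(hot dog): confident yes, confident no, or abstain."""
    if score >= hi:
        return "hot dog"
    if score <= lo:
        return "not hot dog"
    return "unsure"

# With a single 0.5 cutoff, "hot dog" and "not hot dog" are complements...
single_cutoff = lambda score: "hot dog" if score >= 0.5 else "not hot dog"
print(single_cutoff(0.6))   # hot dog

# ...but with confidence bands, failing one test does not imply passing the other.
print(classify(0.6))        # unsure: neither confidently hot dog nor confidently not
```

In other words, whether the distinction matters depends on whether the classifier must always commit to an answer.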

Questions & Intuition for Tackling Deep Learning Problems

Working on data-science problems can be both exhilarating and frustrating. Exhilarating because the occasional insight that boosts your algorithm’s performance can leave you with a lasting high. Frustrating, because you’ll often find yourself at the dead-end of a one-way street, wondering what went wrong.

In this article, I’d like to recount five key lessons that I’ve learned after one too many walks down dead alleyways. I’ve framed these as five questions that I’ve learned to ask myself before taking on new problems or approaches:

  • Question #1: Never mind a neural network; can a human with no prior knowledge, educated on nothing but a diet of your training dataset, solve the problem?
  • Question #2: Is your network looking at your data through the right lens?
  • Question #3: Is your network learning the quirks in your training dataset, or is it learning to solve the problem at hand?
  • Question #4: Does your network have siblings that can give it a leg-up (through pre-trained weights)?
  • Question #5: Is your network incapable or just lazy? If it’s the latter, how do you force it to learn?

Click here to read the rest of this article on Medium

Inside the AI-as-a-Service Phenomenon

And a peek into AWS’s latest playbook

In his 2017 shareholder letter, Jeff Bezos hinted at a new wave of AWS value-add products — AI-powered APIs.

“Amazon Lex (what’s inside Alexa), Amazon Polly, and Amazon Rekognition remove the heavy lifting from natural language understanding, speech generation, and image analysis. They can be accessed with simple API calls — no machine learning expertise required. Watch this space. Much more to come.” — Jeff Bezos

Quote from Jeff Bezos’ 2017 shareholder letter — Credit GeekWire

In the past, AWS has been shrewd in shaping and keeping up with trends, be it database-as-a-service (DBaaS) or hosted messaging and application services. And there’s every reason to expect that this new wave of products will prove just as successful.

Even though the concept of AI-as-a-Service is buzzword-laden — AI and SaaS are, after all, the driving trends of the day — it’s worth looking past the hype at the underlying factors that could make this trend seminal.

Click here to read the rest of this article on Medium

Ecommerce is at Warp Speed: 10 Market Trends That Are Happening Now

Artificial intelligence. Chatbots. Voice search. Virtual reality. Self-driving cars. Technology in 2017 and beyond sure promises to be exciting for consumers. For tech-centric businesses, these trends can be game-changing for those who adapt and overwhelming for those who struggle to keep up. As with all tech waves, disruption will follow close behind, and winners and losers will be anointed when the dust settles.

In this three-part series, we take a look at one niche in particular that is poised for change: product search in Ecommerce. That is, the process by which consumers discover and purchase products online, and the digital interfaces that they use to express intent. In part 1, we look at how the Ecommerce search experience is likely to evolve for consumers in the coming years. In part 2, we examine the technology that will enable these changes. In part 3, the final installment, we consider how these changes will affect the ecosystem of Ecommerce businesses, especially online retailers and the companies that support them.

Star Trek is a beautiful show. Its vision of a possible future for mankind, one in which society is brought together by science and technology, is also a vision of exploring new frontiers with the best in tech.

If you like Star Trek, you’re probably also a fan of the gadgets that appear on the show — voice interface devices, replicators, holodecks and tricorders.

Have you ever wondered about how these devices came to be? Or about who invented these gadgets? And which companies manufactured them? Were these people elevated in the hierarchy in recognition of their efforts? Did these companies mint a proverbial fortune? And what happened to those who were vested in older technology when newer gadgets were invented?

Click here to read the rest of this article on Medium

Why Tech Conversations with Grandpa Matter

Have you ever had a family member ring you up for advice on how to use their phone or computer? That dreaded call from an aunt or grandparent about starting a Hangouts session or using the printer? The one with questions that seem so obvious to you that you get frustrated by how long it takes the other party to come around to the solution?

And have you noticed how, of late, these questions have been coming your way a little more frequently?

These tech conversations with family are far more than innocuous time sinks or sources of amusement. They are representative of meaningful trends that could impact your life.

Click here to read the rest of this article on Medium