Azure Cognitive Services – Vision API

Microsoft Cognitive Services (earlier known as Project Oxford) is a collection of APIs, SDKs and services that software developers can use to build intelligent and discoverable applications.

Microsoft Cognitive Services is Microsoft’s growing portfolio of machine learning APIs, and it lets developers easily add intelligent features – such as emotion detection in images, facial recognition, speech translation, vision recognition, and speech and language understanding – to their applications.

Microsoft’s vision is to bring more personal computing experiences and enhanced productivity to every person and organisation on the planet through intelligent systems that can increasingly see, hear, speak, understand and even begin to reason.


The Vision API returns information about the visual content found in an image. It uses tagging, domain-specific models and descriptions in four languages to identify content and label it with confidence. You can apply the adult/racy settings to detect potential adult content, and the API can also identify image types and color schemes in pictures.
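For example, to check only the adult/racy flags, the color information and the image type, you can request just those visual features when you call the API. The following is a minimal sketch using the same ComputerVision SDK as the full program later in this post; the subscription key, endpoint and image URL are placeholders you would replace with your own values.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;

class AdultColorImageTypeDemo
{
    // Placeholders – substitute your own key and a public image URL.
    private const string subscriptionKey = "<your-subscription-key>";
    private const string imageUrl = "<public-image-url>";

    static async Task Main()
    {
        ComputerVisionClient client = new ComputerVisionClient(
            new ApiKeyServiceClientCredentials(subscriptionKey),
            new System.Net.Http.DelegatingHandler[] { })
        {
            // Use the endpoint shown for your own resource in the Azure portal.
            Endpoint = "https://westcentralus.api.cognitive.microsoft.com"
        };

        // Request only the adult/racy flags, color information and image type.
        List<VisualFeatureTypes> features = new List<VisualFeatureTypes>
        {
            VisualFeatureTypes.Adult,
            VisualFeatureTypes.Color,
            VisualFeatureTypes.ImageType
        };

        ImageAnalysis analysis = await client.AnalyzeImageAsync(imageUrl, features);

        Console.WriteLine("Adult content: {0} (score {1:0.00})",
            analysis.Adult.IsAdultContent, analysis.Adult.AdultScore);
        Console.WriteLine("Racy content:  {0} (score {1:0.00})",
            analysis.Adult.IsRacyContent, analysis.Adult.RacyScore);
        Console.WriteLine("Dominant colors: {0}",
            string.Join(", ", analysis.Color.DominantColors));
        Console.WriteLine("Accent color: #{0}", analysis.Color.AccentColor);
        Console.WriteLine("Clip art type: {0}, line drawing type: {1}",
            analysis.ImageType.ClipArtType, analysis.ImageType.LineDrawingType);
    }
}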

The Vision API can also detect and extract handwritten text from notes, letters, essays, whiteboards, forms and other sources. We can reduce paper clutter and be more productive by photographing handwritten notes instead of transcribing them, and make the digital notes easy to find by implementing search.
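Handwriting recognition is exposed as an asynchronous operation: you submit the image, receive an Operation-Location header, and then poll until the text is ready. The sketch below shows roughly how this looks with the RecognizeText methods available in the SDK version used later in this post (newer SDK releases replace them with the Read API); again, the key, endpoint and image URL are placeholders.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;

class HandwritingDemo
{
    // Placeholders – substitute your own key and a photo of a handwritten note.
    private const string subscriptionKey = "<your-subscription-key>";
    private const string imageUrl = "<url-of-handwritten-note>";

    static async Task Main()
    {
        ComputerVisionClient client = new ComputerVisionClient(
            new ApiKeyServiceClientCredentials(subscriptionKey),
            new System.Net.Http.DelegatingHandler[] { })
        {
            // Use the endpoint shown for your own resource in the Azure portal.
            Endpoint = "https://westcentralus.api.cognitive.microsoft.com"
        };

        // Start the asynchronous text-recognition operation in handwritten mode.
        RecognizeTextHeaders headers =
            await client.RecognizeTextAsync(imageUrl, TextRecognitionMode.Handwritten);

        // The operation id is the last segment of the Operation-Location URL.
        string operationId = headers.OperationLocation
            .Substring(headers.OperationLocation.LastIndexOf('/') + 1);

        // Poll until the service has finished reading the image.
        TextOperationResult result = await client.GetTextOperationResultAsync(operationId);
        while (result.Status == TextOperationStatusCodes.NotStarted ||
               result.Status == TextOperationStatusCodes.Running)
        {
            await Task.Delay(1000);
            result = await client.GetTextOperationResultAsync(operationId);
        }

        // Print each recognized line of handwriting.
        foreach (Line line in result.RecognitionResult.Lines)
        {
            Console.WriteLine(line.Text);
        }
    }
}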

To use the Microsoft Cognitive Services Vision APIs, you first need to create a Computer Vision resource in the Azure portal.

  1. Sign in to the Azure portal.
  2. Click Create a resource.
  3. Select the AI + Machine Learning category.
  4. Select Computer Vision.
  5. Fill in the required details as shown below and click Create.

CreateVisionAPI.PNG

Now copy the keys as shown below.

keys.PNG

Now I am going to analyse this image, which is stored locally on my system at

C:\Images\LocalImage.png

LocalImage.png

and this image, which is hosted on this blog at the public URL “https://CloudAndMobileBlogcom.files.wordpress.com/2018/08/18595200_1789876414372358_3315871681796401471_o.jpg”.

18595200_1789876414372358_3315871681796401471_o

I wrote the following code to analyze these images:

using Microsoft.Azure.CognitiveServices.Vision.ComputerVision;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

namespace TestVisionAPI
{
    class Program
    {
        private const string subscriptionKey = "bcce77ac85884d649f220d26da0b9c90";
        private const string localImagePath = @"C:\Images\LocalImage.png";
        private const string remoteImageUrl =
            "https://cloudandmobileblogcom.files.wordpress.com/2018/08/18595200_1789876414372358_3315871681796401471_o.jpg";

        // Specify the features to return
        private static readonly List<VisualFeatureTypes> features =
            new List<VisualFeatureTypes>()
            {
                VisualFeatureTypes.Categories, VisualFeatureTypes.Description,
                VisualFeatureTypes.Faces, VisualFeatureTypes.ImageType,
                VisualFeatureTypes.Tags
            };

        static void Main(string[] args)
        {
            ComputerVisionClient computerVision = new ComputerVisionClient(
                new ApiKeyServiceClientCredentials(subscriptionKey),
                new System.Net.Http.DelegatingHandler[] { });

            // You must use the same region as you used to get your subscription
            // keys. For example, if you got your subscription keys from westus,
            // replace "Westcentralus" with "Westus".
            //
            // Free trial subscription keys are generated in the westcentralus
            // region. If you use a free trial subscription key, you shouldn't
            // need to change the region.

            // Specify the Azure region
            //computerVision.AzureRegion = AzureRegions.Centralindia;

            Console.WriteLine("Images being analyzed …");
            var t1 = AnalyzeRemoteAsync(computerVision, remoteImageUrl);
            var t2 = AnalyzeLocalAsync(computerVision, localImagePath);

            Task.WhenAll(t1, t2).Wait(5000);
            Console.WriteLine("Press any key to exit");
            Console.ReadLine();
        }

        // Analyze a remote image
        private static async Task AnalyzeRemoteAsync(
            ComputerVisionClient computerVision, string imageUrl)
        {
            if (!Uri.IsWellFormedUriString(imageUrl, UriKind.Absolute))
            {
                Console.WriteLine(
                    "\nInvalid remoteImageUrl:\n{0} \n", imageUrl);
                return;
            }

            ImageAnalysis analysis =
                await computerVision.AnalyzeImageAsync(imageUrl, features);
            DisplayResults(analysis, imageUrl);
        }

        // Analyze a local image
        private static async Task AnalyzeLocalAsync(
            ComputerVisionClient computerVision, string imagePath)
        {
            if (!File.Exists(imagePath))
            {
                Console.WriteLine(
                    "\nUnable to open or read localImagePath:\n{0} \n", imagePath);
                return;
            }

            using (Stream imageStream = File.OpenRead(imagePath))
            {
                ImageAnalysis analysis = await computerVision.AnalyzeImageInStreamAsync(
                    imageStream, features);
                DisplayResults(analysis, imagePath);
            }
        }

        // Display the most relevant caption for the image
        private static void DisplayResults(ImageAnalysis analysis, string imageUri)
        {
            Console.WriteLine(imageUri);
            Console.WriteLine(analysis.Description.Captions[0].Text + "\n");
        }
    }
}
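The DisplayResults method above prints only the most relevant caption. Since the features list already requests tags and faces, you could swap in a slightly longer version of the method along these lines to also print each tag with its confidence score and the age and gender of any detected faces (a sketch, not part of the original sample):

// A possible replacement for DisplayResults in the program above.
private static void DisplayResults(ImageAnalysis analysis, string imageUri)
{
    Console.WriteLine(imageUri);
    Console.WriteLine(analysis.Description.Captions[0].Text);

    // Each tag comes back with a confidence score between 0 and 1.
    foreach (ImageTag tag in analysis.Tags)
    {
        Console.WriteLine("  tag: {0} (confidence {1:P1})", tag.Name, tag.Confidence);
    }

    // Faces include an estimated age and gender plus a bounding rectangle.
    foreach (FaceDescription face in analysis.Faces)
    {
        Console.WriteLine("  face: {0}, approx. age {1}", face.Gender, face.Age);
    }

    Console.WriteLine();
}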
