A prototype for Text Analytics: a general REST client combined with service-specific TypeScript types that help you navigate and call the REST API directly.
```bash
npm install https://github.com/joheredi/ta-lowlevel-prototype @azure/identity
```
- An Azure subscription.
- An existing Cognitive Services or Text Analytics resource. If you need to create the resource, you can use the Azure Portal or Azure CLI.
If you use the Azure CLI, replace `<your-resource-group-name>` and `<your-resource-name>` with your own unique names:

```bash
az cognitiveservices account create --kind TextAnalytics --resource-group <your-resource-group-name> --name <your-resource-name> --sku <your-sku-name> --location <your-location>
```
To create a client object to access the Text Analytics API, you will need the `endpoint` of your Text Analytics resource and a `credential`. The Text Analytics client can use either Azure Active Directory credentials or an API key credential to authenticate.
You can find the endpoint for your Text Analytics resource either in the Azure Portal or by using the Azure CLI snippet below:

```bash
az cognitiveservices account show --name <your-resource-name> --resource-group <your-resource-group-name> --query "properties.endpoint"
```
Use the Azure Portal to browse to your Text Analytics resource and retrieve an API key, or use the Azure CLI snippet below:

Note: Sometimes the API key is referred to as a "subscription key" or "subscription API key."

```bash
az cognitiveservices account keys list --resource-group <your-resource-group-name> --name <your-resource-name>
```
Once you have an API key and endpoint, you can pass the key to the client factory to authenticate as follows:

```typescript
import { createTextAnalyticsVerbFirst as TextAnalytics } from "@azure/textanalytics-lowlevel";

const client = TextAnalytics({ key: "<API key>" }, "<endpoint>");
```
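Hard-coding the key works for quick experiments, but in practice you will usually read it from the environment. The following is a minimal sketch; the `getConfig` helper and the `API_KEY`/`ENDPOINT` variable names are our own convention, not part of the library:

```typescript
// Hypothetical helper: reads the Text Analytics endpoint and API key from
// environment variables and fails fast when the key is missing.
function getConfig(): { endpoint: string; key: string } {
  const endpoint =
    process.env["ENDPOINT"] ?? "https://<accountName>.cognitiveservices.azure.com";
  const key = process.env["API_KEY"];
  if (!key) {
    throw new Error("Set the API_KEY environment variable before running the samples");
  }
  return { endpoint, key };
}
```

You could then create the client with `TextAnalytics({ key }, endpoint)` using the values returned by `getConfig()`.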
API key authentication is used in most of the examples, but you can also authenticate with Azure Active Directory using the Azure Identity library. To use the DefaultAzureCredential provider shown below, or other credential providers provided with the Azure SDK, install the `@azure/identity` package.
You will also need to register a new AAD application and grant access to Text Analytics by assigning the "Cognitive Services User" role to your service principal (note: other roles such as "Owner" will not grant the necessary permissions; only "Cognitive Services User" will suffice to run the examples and the sample code).

Set the values of the client ID, tenant ID, and client secret of the AAD application as environment variables: `AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, `AZURE_CLIENT_SECRET`.
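For example, in a bash shell the variables can be exported before running the samples (the placeholder values are, of course, your own application's):

```shell
export AZURE_CLIENT_ID="<client-id>"
export AZURE_TENANT_ID="<tenant-id>"
export AZURE_CLIENT_SECRET="<client-secret>"
```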
```typescript
// import { createTextAnalyticsVerbFirst as TextAnalytics } from "@azure/textanalytics-lowlevel";
import { createTextAnalyticsPathFirst as TextAnalytics } from "@azure/textanalytics-lowlevel";
import { DefaultAzureCredential } from "@azure/identity";

const client = TextAnalytics(new DefaultAzureCredential(), "<endpoint>");
```
```typescript
import {
  createTextAnalyticsPathFirst as TextAnalytics,
  TextDocumentInput,
} from "@azure/textanalytics-lowlevel";

const endpoint = "https://<accountName>.cognitiveservices.azure.com/";
const key = process.env["API_KEY"] || "<API KEY>";

const documents: TextDocumentInput[] = [
  { id: "1", text: "This is my fake SSN 22-333-4444" },
];

async function analyzeText() {
  // Create the text analytics client
  const client = TextAnalytics({ key }, endpoint);

  // Get subclients for PII and General Recognition
  const piiClient = client.path("/entities/recognition/pii");

  // You can also use pathUnchecked to send a request to an arbitrary path
  const generalRecognitionClient = client.pathUnchecked(
    "/entities/recognition/general"
  );

  // Call POST on the PII subclient
  const piiResult = await piiClient.post({ body: { documents } });

  // Call POST on the general recognition subclient. Since generalRecognitionClient
  // was created using pathUnchecked, all VERBS are available, so you need to make
  // sure that the service supports the VERB on the given path.
  const generalResult = await generalRecognitionClient.post({
    body: { documents },
  });

  if (piiResult.status === 200) {
    console.log(`=== PII Results ===`);
    for (const doc of piiResult.body.documents) {
      console.log(`Redacted Text: ${doc.redactedText}`);
    }
    // === PII Results ===
    // Redacted Text: This is my fake SSN ***********
  }

  if (generalResult.status === 200) {
    console.log(`=== General Results ===`);
    for (const doc of generalResult.body.documents) {
      console.log(`### Recognition Results for Document: ${doc.id}`);
      for (const entity of doc.entities) {
        console.log(
          `${entity.text} => ${entity.category} (${
            entity.confidenceScore * 100
          }%)`
        );
      }
      // === General Results ===
      // ### Recognition Results for Document: 1
      // 22-333-4444 => Phone Number (80%)
    }
  }
}

analyzeText().catch(console.error);
```
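The `status === 200` checks above do more than guard against errors: because the typed responses discriminate on the status code, TypeScript narrows `body` to the matching shape inside each branch. Below is a self-contained sketch of that pattern using hypothetical response types that only mirror the real ones exported by the library:

```typescript
// Hypothetical response shapes, for illustration only; the real types come
// from @azure/textanalytics-lowlevel.
interface PiiSuccessResponse {
  status: 200;
  body: { documents: Array<{ id: string; redactedText: string }> };
}

interface PiiErrorResponse {
  status: 400 | 429 | 500;
  body: { error: { code: string; message: string } };
}

type PiiResponse = PiiSuccessResponse | PiiErrorResponse;

// Checking `status` narrows the union, so `body` has the right shape in
// each branch without any casting.
function summarize(result: PiiResponse): string {
  if (result.status === 200) {
    return result.body.documents.map((d) => d.redactedText).join("\n");
  }
  return `Request failed: ${result.body.error.code}`;
}

console.log(
  summarize({
    status: 200,
    body: { documents: [{ id: "1", redactedText: "This is my fake SSN ***********" }] },
  })
); // prints "This is my fake SSN ***********"
```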
```typescript
import { createTextAnalyticsVerbFirst as TextAnalytics } from "@azure/textanalytics-lowlevel";

const endpoint = "https://<accountName>.cognitiveservices.azure.com";
const key = process.env["API_KEY"] || "<API KEY>";

async function analyzeLanguage() {
  const client = TextAnalytics({ key }, endpoint);
  const languagesResult = await client.request("POST /languages", {
    body: { documents: [{ id: "1", text: "This is a test text" }] },
  });

  if (languagesResult.status === 200) {
    for (const result of languagesResult.body.documents) {
      console.log(
        `Sentence with Id: '${result.id}' detected language: ${
          result.detectedLanguage.name
        } with ${result.detectedLanguage.confidenceScore * 100}% confidence`
      );
    }
  } else {
    throw languagesResult.body.error;
  }
}

analyzeLanguage().catch(console.error);
```
```typescript
import { createTextAnalyticsVerbFirst as TextAnalytics } from "@azure/textanalytics-lowlevel";

const endpoint = "https://<accountName>.cognitiveservices.azure.com";
const key = process.env["API_KEY"] || "<API KEY>";

async function analyzeLanguage() {
  const client = TextAnalytics({ key }, endpoint);
  const languagesResult = await client.requestUnchecked("POST /languages", {
    body: { documents: [{ id: "1", text: "This is a test text" }] },
  });

  if (languagesResult.status === 200) {
    for (const result of languagesResult.body.documents) {
      console.log(
        `Sentence with Id: '${result.id}' detected language: ${
          result.detectedLanguage.name
        } with ${result.detectedLanguage.confidenceScore * 100}% confidence`
      );
    }
  } else {
    throw languagesResult.body.error;
  }
}

analyzeLanguage().catch(console.error);
```
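The only difference from the previous sample is `requestUnchecked`, which accepts any `"VERB /path"` route string rather than restricting you to the routes the type system knows about. The route-string convention itself is simple; a plausible sketch of how such a string splits into its parts (for illustration only, this is not the library's actual implementation) is:

```typescript
// Illustrative only: splits a route string such as "POST /languages" into
// its HTTP method and path.
function parseRoute(route: string): { method: string; path: string } {
  const separator = route.indexOf(" ");
  if (separator === -1) {
    throw new Error(`Invalid route: "${route}" (expected "VERB /path")`);
  }
  return {
    method: route.slice(0, separator),
    path: route.slice(separator + 1),
  };
}

console.log(parseRoute("POST /languages")); // { method: 'POST', path: '/languages' }
```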