This is part two of a two-part article covering AI Builder and the Power Platform. I recommend you skim through part one for some much-needed context.

In part one of this two-part series we created an AI Builder model using the Form Processing model type and trained it to extract and set the values for selected fields in documents of a specific type (in my case: a statement of work).

We start with the blueprint for a Content Type for use in SharePoint Online:

Statement of Work - Class View

In this article (part two of the series), we’ll be creating a Flow using Power Automate. It will use our AI Builder model to extract and store metadata for statements of work uploaded to a SharePoint Online Document Library.

(I’ll continue to use the statement of work as an example throughout the series. Feel free to substitute it, and its selected fields, for something more applicable in your own organisation.)

Prerequisites

Before we create the Flow, we’ll need to perform the following actions (the Site Owner role will give you sufficient rights to complete this work). I don’t want to go into detail here, but I’ve linked to supporting articles if you’re new to SharePoint Online.

With the prerequisite setup complete, our Document Library settings should look something like this:

SharePoint Online - List Settings


With that done, we’re ready to build our Flow!

Design

Before I build a Flow using Power Automate, I like to sketch a high-level design to capture any decision logic and get a better feel for what the workflow has to do. We can worry about how these steps work when we implement them. Here’s what I want the Flow to do:

Flow - Preliminary Design

There are three sub-processes in the Flow. The trigger will be the creation of a document in our Document Library (manual or otherwise). If that document isn’t in pdf format, we’ll need to convert it. This is because AI Builder’s Form Processing model does not support documents in native (Microsoft) Office format. Finally, we’ll need to send the document off to AI Builder so it can be analysed.
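Stripped of the boxes and arrows, the logic I’m after looks roughly like this:

Trigger:  a document is created (or modified) in the Document Library
Check:    is the document already a pdf?
          - yes: grab the file content as-is
          - no:  have SharePoint Online convert it to pdf, then grab that content
Analyse:  send the pdf content to the AI Builder model
Update:   write the extracted values back to the document’s List Item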

Creating the Flow

OK, let’s build our Flow. The solution is technically ‘no-code’, but we’re going to write some expressions to handle things like token substitution. Think: Excel, not Visual Studio Code.

Creating a Flow is easy. Simply head on over to flow.microsoft.com, sign in and hit + Create. Our Flow will kick in when someone uploads a document to SharePoint Online, so select the Automated flow template:

Power Automate - Create Flow

The first building block of an automated flow is the trigger. Select (or search for) the trigger called When an item is created or modified and hit Create.

Power Automate - Create Trigger

Configuring the trigger is easy. Simply pick your Site Address from the list provided, and specify the List Name (set a Custom Value if Power Automate is having a hard time resolving your List Name, as it did frequently for me). Click + New Step when ready.

Power Automate - Trigger Settings

Handling Variables

Next, we need to create some variables to store values we’ll need to reference along the way. The Initialize variable action is here to help, so we’ll add one for each variable we need.

  • Initialize a variable called FolderPath (type: string) and set its value using the following expression:
substring(triggerOutputs()?['body/{Path}'],0,add(length(triggerOutputs()?['body/{Path}']),-1))

(This removes the trailing ‘/’ from the relative path to the document, using the List Item’s native Path property. It sucks, but we need to pass this value to a web service later in the Flow, and it just breaks if you keep that trailing ‘/’ there.)

  • Initialize a variable called PDFUrl (type: string) and leave its Value blank for now (this will store a reference to the converted PDF we create later in the Flow).
  • Initialize a variable called PDFContent (type: object) and leave its Value blank for now (this will store the content to be passed to AI Builder for analysis later in the Flow).

At this point in the process, your Flow should look like this:

Power Automate - Flow View

(Note: You can rename your triggers and actions to help with readability, as shown in the image above).

Decision Logic

Next, we need an action to check whether our document is in pdf format. Add another action and search for: Condition. We can use the File Name property in Dynamic content (created by our trigger) and simply check the last three characters of that value (i.e. the file extension).

Power Automate - Dynamic Content
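If you’d rather do the whole check in a single expression instead of picking tokens in the Condition designer, something along these lines works. (I’m assuming here that the File Name token resolves to the trigger’s {FilenameWithExtension} property; check the dynamic content panel if yours differs.)

equals(toLower(substring(triggerOutputs()?['body/{FilenameWithExtension}'], sub(length(triggerOutputs()?['body/{FilenameWithExtension}']), 3), 3)), 'pdf')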

At this point our Flow looks like this, and must split into two: the ‘yes’ branch (for cases where the document is already a pdf) and the ‘no’ branch (anything else). We’ll handle the ‘yes’ branch first, as it’s the easier of the two.

Power Automate - Quick Recap

The ‘Yes’ Path

We’ll need to create a reference to the content of the document and pass that to AI Builder for analysis. The good news is that if the document is already in pdf format, we can use the Get file content using path action and assign its output to the PDFContent variable we created earlier in the Flow:

  • Add a Get file content using path action to the Yes branch of the Flow. Set the File Path to the Full Path property (under dynamic content) that was created by our trigger.
  • Next, add a Set Variable action. We’re going to assign the File Content property (under dynamic content) created by the previous action to the PDFContent variable we initialized earlier.

Still with me? Your setup for the ‘yes’ branch should look like this:

Power Automate - 'Yes' Path


The ‘No’ Path

It gets a little tough here, but stick with me. If the document is not in pdf format, we must convert it.

There are a number of options here, including third-party actions you can purchase to handle the conversion for you. But there’s also a way to use SharePoint Online itself to convert the document. For this solution I’ve drawn on prescriptive guidance published by Paul Culmsee (@paulculmsee). Full credit to him for figuring this stuff out!

Conversion works like this:

  • We issue a request to SharePoint Online for the document payload (including valuable meta-data).
  • We parse the resulting JSON so that the meta-data we need is available as dynamic content.
  • We assemble the URL to the converted pdf using bits from the JSON payload.
  • We issue an HTTP request for the converted pdf and store the body of the response in our PDFContent variable.

OK, let’s go:

  • Add a Send an HTTP request to SharePoint action to the No branch of the Condition. The Uri we want to invoke is as follows:
_api/web/lists/GetByTitle('<Document Library Name>')/RenderListDataAsStream?FilterField1=ID&FilterValue1=<ID>

(We need to swap out <Document Library Name> with the name of your Document Library, and <ID> with the ID property of the document that was uploaded there. Fortunately, the ID property is available as dynamic content.)
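As a purely hypothetical example, if your Document Library were called Statements of Work and the uploaded document had an ID of 12, the Uri would end up looking like this:

_api/web/lists/GetByTitle('Statements of Work')/RenderListDataAsStream?FilterField1=ID&FilterValue1=12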

The parameters to this request must contain instructions to return a Uri to the pdf version of the document, so we include the following in the body of our request:

{
    "parameters": {
        "RenderOptions": 4103,
        "FolderServerRelativeUrl": "/sites/sandpit/@{variables('FolderPath')}"
    }
}

(Here we need to send the folder path with the trailing ‘/’ removed, so we’re using the FolderPath variable we set at the start of the Flow. The RenderOptions value of 4103 combines the flags that return the list data, list schema and media service URLs we’ll need in the next steps.)

So your action configuration should look similar to this:

Power Automate - Send an HTTP request to SharePoint action configuration

Next, we need a sample of the payload returned by this service call so we can generate a schema and reference its properties as dynamic content in our Flow. The easiest way to get one is to Test the Flow and copy the output. Hit the Test button, select I’ll perform the trigger action, and hit Save & Test.

Flow will wait patiently for you to upload a document to your Library to trigger the workflow. Once it runs, the steps in your Flow will be 'ticked off' as they are tested. Examine the Send an HTTP request to SharePoint action and copy the body content to your clipboard (we’ll use this to configure the next action).

Power Automate - Send HTTP request to SharePoint test output

  • Switch back to edit mode and add a Parse JSON action. We’re going to assign the body property (under dynamic content) created by the previous action to the Content property. To set the schema property, select Generate from sample and paste the body content you copied to your clipboard when you ran the test.

Your action configuration should look similar to this:

Power Automate - Setting the JSON schema

We need to parse the output of the HTTP request because we’re using key values in the payload to create the Url to the converted PDF in our next step.
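To give you a feel for what we’re pulling out of that payload, here’s a heavily trimmed (and purely illustrative) sketch of the parts we care about; the real response contains a lot more:

{
    "ListData": {
        "Row": [
            {
                "File_x0020_Type": "docx",
                ".spItemUrl": "<url to the source document>"
            }
        ]
    },
    "ListSchema": {
        ".mediaBaseUrl": "https://australiaeast1-mediap.svc.ms",
        ".callerStack": "<encoded token>",
        ".driveAccessToken": "<encoded token>"
    }
}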

  • Add a Set Variable action for the PDFUrl variable. We’re going to use the following expression to assemble our Url:
@{body('Parse_JSON')?['ListSchema']?['.mediaBaseUrl']}/transform/pdf?provider=spo&inputFormat=@{first(body('Parse_JSON')?['ListData']?['Row'])?['File_x0020_Type']}&cs=@{body('Parse_JSON')?['ListSchema']?['.callerStack']}&docid=@{first(body('Parse_JSON')?['ListData']?['Row'])?['.spItemUrl']}&@{body('Parse_JSON')?['ListSchema']?['.driveAccessToken']}

Your action should look like this:

Power Automate - Composite URL

  • The .mediaBaseUrl property contains the URL to the platform’s media content delivery service (e.g. australiaeast1-mediap.svc.ms)
  • The first expression extracts the file type, so we can tell the conversion service what format to convert from (e.g. docx)
  • The second expression extracts the URI of the source (pre-conversion) document in SharePoint Online.
  • The .callerStack and .driveAccessToken properties are encoded strings (tokens) providing useful session context.
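Substituting placeholders for the parsed values, the assembled Url follows this general shape (the real thing is considerably longer once the tokens are in place):

<mediaBaseUrl>/transform/pdf?provider=spo&inputFormat=<file type>&cs=<callerStack>&docid=<spItemUrl>&<driveAccessToken>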

(If you want, Test the Flow at this stage. Copy the output of the Set PDF Url action into your browser. If it’s correct, it will display a PDF version of the document you submitted at the start of the test).

  • Next, we need the Flow to request the converted pdf directly. Add an HTTP action to your Flow (astonishingly, this action requires a premium license). Assign the PDFUrl variable to the URI property.

Your action should look like this:

Power Automate - HTTP Request Configuration

  • The last action in your ‘no’ branch will assign the body of the response generated by the HTTP action (as dynamic content) to the PDFContent variable we initialized at the start of the Flow.

Your action should look like this:

Power Automate - Recap

Quick Recap

At this stage, your Flow has two branches, based on a test to see if the document is in pdf format. Each branch ultimately sets the PDFContent variable we’ll use for the final step. In the yes branch, the content is a snapshot of the document uploaded to SharePoint Online. In the no branch, it’s a snapshot of a pdf conversion of that document. Here’s what it should look like:

Power Automate - Recap

Calling AI Builder

The final steps of the Flow occur when the two branches converge. We’ll invoke the AI Builder model we created in part one of this series and then update the document’s List Item once we have the meta-data.

  • Search for an action called Process and save information from forms and add it to your Flow where the conditional branches converge. Select your model from the list provided. Use the PDFContent variable (as dynamic content) here.

Your action should look like this:

Power Automate - Analyse the Document action configuration

  • Next, we need to take the output of the analysis and use it to update the column data for the statement of work content type in SharePoint Online. Add an Update Item (SharePoint) action to the Flow. Once you specify the List Name the fields will be loaded in for you. Simply use the dynamic content panel to assign values returned from AI Builder to your columns.

Your action should look like this:

Power Automate - Updating the SharePoint Online List item

As a stretch goal, you can add some resilience to your Flow by adding conditional logic when you update your list item. AI Builder passes its confidence scores back to Flow, so you could, for example, update the list item only if the confidence score meets a specific threshold. For example:
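Using the Condition designer, the check might look something like this (the field name is purely illustrative; pick the confidence score token for one of your own model’s fields from the dynamic content panel):

Customer Name confidence score    is greater than or equal to    0.7

If the score falls below the threshold, you could skip the update or flag the document for manual review instead.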

The Final Solution

Your final solution should look something like this:

As statement of work documents are uploaded to the Document Library, Flow will pick them up and push their content to your model in AI Builder for analysis. The result is a complete set of metadata for each document without the need for manual intervention.

Final things to note:

  • Processing is asynchronous and will take a few seconds (especially if conversion is needed), so the meta-data will not be available immediately after the document is uploaded. My Flow completed on average in about 15-25 seconds. Your mileage may vary.

Wow. That was quite a lot. Thanks for sticking with me. In this series, we trained and published a model using AI Builder and then created a Flow using Power Automate to invoke it.

The solution analyses documents as they are uploaded to SharePoint Online. Column values are extracted from the document as AI Builder runs the analysis, and the Flow updates the associated List Item, assigning values to our Columns.