naulacambra
Frequent Visitor

Create work item at Azure DevOps from an email with attachments

My goal here is to create work items whenever an email arrives.

 

I've successfully created a flow to create a Work Item at my Azure DevOps board when an email arrives.

 

The issue comes when I try to attach the email attachments to the created Work Item.

 

Since there isn't any "direct" way to do it with an action, I've tried something like this solution, where it is proposed to use the Azure DevOps API service to attach the file.

 

According to the Azure DevOps API documentation (here), to attach a file I should send the binary data as JSON, and I've been trying to achieve that with this flow:

[Screenshot: Capture.PNG]

[Screenshot: Capture2.PNG]

{
    "type": "object",
    "properties": {
        "Id": {
            "type": "string"
        },
        "Name": {
            "type": "string"
        },
        "ContentBytes": {
            "type": "string"
        },
        "ContentType": {
            "type": "string"
        },
        "Size": {
            "type": "integer"
        }
    }
}
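For reference, an attachment item matching this schema looks roughly like the following (values are illustrative only; ContentBytes is the base64-encoded file content):

{
    "Id": "AAMkADQ...",
    "Name": "screenshot.png",
    "ContentBytes": "iVBORw0KGgoAAAANSUhEUg==",
    "ContentType": "image/png",
    "Size": 18342
}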

 

 

[Screenshot: Capture3.PNG]

binary(body('Parse_JSON')?['ContentBytes'])

 

But it doesn't work. Has anyone managed to do this successfully? Any hint would be helpful.

 

Thank you

 

14 REPLIES

Hi,

 

Maybe we can turn to @Pieter_Veenstra; there is a section on "Send an HTTP request to SharePoint" in his blog. I think if he has time, he might be able to update the blog with a post about "Send an HTTP request to Azure DevOps". That would help us a lot. 🙂

Pieter_Veenstra
Community Champion

Hi @naulacambra,

 

I don't have the Azure DevOps setup to test this out, but looking at the documentation link that was included in the previous post:

 

https://docs.microsoft.com/en-us/rest/api/vsts/wit/attachments/create?view=vsts-rest-4.1#upload_a_bi...

 

I would say that you need to format the body correctly, so it should look something like the JSON below. As a test, you could probably copy exactly that JSON into the body and see if the HTTP request works. Once that works, replace the binary data in the JSON array with your own data.

 

{
  "0": 137,
  "1": 80,
  "2": 78,
  "3": 71,
  "4": 13,
  "5": 10,
  "6": 26,
  "7": 10,
  "8": 0,
  "9": 0,
  "10": 0,
  "11": 13,
  "12": 73,
  "13": 72,
  "14": 68,
  "15": 82,
  "16": 0,
  "17": 0,
  "18": 0,
  "19": 24,
  "20": 0,
  "21": 0,
  "22": 0,
  "23": 24,
  "24": 8,
  "25": 2,
  "26": 0,
  "27": 0,
  "28": 0,
  "29": 111,
  "30": 21,
  "31": 170,
  "32": 175,
  "33": 0,
  "34": 0,
  "35": 0,
  "36": 1,
  "37": 115,
  "38": 82,
  "39": 71,
  "40": 66,
  "41": 0,
  "42": 174,
  "43": 206,
  "44": 28,
  "45": 233,
  "46": 0,
  "47": 0,
  "48": 0,
  "49": 4,
  "50": 103,
  "51": 65,
  "52": 77,
  "53": 65,
  "54": 0,
  "55": 0,
  "56": 177,
  "57": 143,
  "58": 11,
  "59": 252,
  "60": 97,
  "61": 5,
  "62": 0,
  "63": 0,
  "64": 0,
  "65": 9,
  "66": 112,
  "67": 72,
  "68": 89,
  "69": 115,
  "70": 0,
  "71": 0,
  "72": 14,
  "73": 195,
  "74": 0,
  "75": 0,
  "76": 14,
  "77": 195,
  "78": 1,
  "79": 199,
  "80": 111,
  "81": 168,
  "82": 100,
  "83": 0,
  "84": 0,
  "85": 0,
  "86": 101,
  "87": 73,
  "88": 68,
  "89": 65,
  "90": 84,
  "91": 56,
  "92": 79,
  "93": 237,
  "94": 204,
  "95": 65,
  "96": 10,
  "97": 192,
  "98": 32,
  "99": 12,
  "100": 68,
  "101": 81,
  "102": 239,
  "103": 127,
  "104": 105,
  "105": 27,
  "106": 240,
  "107": 167,
  "108": 24,
  "109": 146,
  "110": 52,
  "111": 22,
  "112": 138,
  "113": 80,
  "114": 240,
  "115": 237,
  "116": 156,
  "117": 140,
  "118": 211,
  "119": 250,
  "120": 71,
  "121": 206,
  "122": 80,
  "123": 109,
  "124": 227,
  "125": 80,
  "126": 83,
  "127": 188,
  "128": 19,
  "129": 213,
  "130": 217,
  "131": 34,
  "132": 141,
  "133": 164,
  "134": 55,
  "135": 190,
  "136": 70,
  "137": 104,
  "138": 88,
  "139": 73,
  "140": 90,
  "141": 161,
  "142": 55,
  "143": 137,
  "144": 162,
  "145": 53,
  "146": 180,
  "147": 213,
  "148": 198,
  "149": 33,
  "150": 210,
  "151": 60,
  "152": 31,
  "153": 130,
  "154": 33,
  "155": 65,
  "156": 87,
  "157": 249,
  "158": 68,
  "159": 140,
  "160": 230,
  "161": 109,
  "162": 105,
  "163": 200,
  "164": 163,
  "165": 55,
  "166": 249,
  "167": 203,
  "168": 144,
  "169": 224,
  "170": 71,
  "171": 132,
  "172": 134,
  "173": 149,
  "174": 14,
  "175": 9,
  "176": 254,
  "177": 89,
  "178": 220,
  "179": 156,
  "180": 167,
  "181": 161,
  "182": 87,
  "183": 206,
  "184": 80,
  "185": 165,
  "186": 247,
  "187": 11,
  "188": 116,
  "189": 99,
  "190": 71,
  "191": 0,
  "192": 204,
  "193": 122,
  "194": 63,
  "195": 206,
  "196": 0,
  "197": 0,
  "198": 0,
  "199": 0,
  "200": 73,
  "201": 69,
  "202": 78,
  "203": 68,
  "204": 174,
  "205": 66,
  "206": 96,
  "207": 130,
  "208": 0,
  "209": 0,
  "BYTES_PER_ELEMENT": 1,
  "buffer": {
    "0": 137,
    "1": 80,
    "2": 78,
    "3": 71,
    "4": 13,
    "5": 10,
    "6": 26,
    "7": 10,
    "8": 0,
    "9": 0,
    "10": 0,
    "11": 13,
    "12": 73,
    "13": 72,
    "14": 68,
    "15": 82,
    "16": 0,
    "17": 0,
    "18": 0,
    "19": 24,
    "20": 0,
    "21": 0,
    "22": 0,
    "23": 24,
    "24": 8,
    "25": 2,
    "26": 0,
    "27": 0,
    "28": 0,
    "29": 111,
    "30": 21,
    "31": 170,
    "32": 175,
    "33": 0,
    "34": 0,
    "35": 0,
    "36": 1,
    "37": 115,
    "38": 82,
    "39": 71,
    "40": 66,
    "41": 0,
    "42": 174,
    "43": 206,
    "44": 28,
    "45": 233,
    "46": 0,
    "47": 0,
    "48": 0,
    "49": 4,
    "50": 103,
    "51": 65,
    "52": 77,
    "53": 65,
    "54": 0,
    "55": 0,
    "56": 177,
    "57": 143,
    "58": 11,
    "59": 252,
    "60": 97,
    "61": 5,
    "62": 0,
    "63": 0,
    "64": 0,
    "65": 9,
    "66": 112,
    "67": 72,
    "68": 89,
    "69": 115,
    "70": 0,
    "71": 0,
    "72": 14,
    "73": 195,
    "74": 0,
    "75": 0,
    "76": 14,
    "77": 195,
    "78": 1,
    "79": 199,
    "80": 111,
    "81": 168,
    "82": 100,
    "83": 0,
    "84": 0,
    "85": 0,
    "86": 101,
    "87": 73,
    "88": 68,
    "89": 65,
    "90": 84,
    "91": 56,
    "92": 79,
    "93": 237,
    "94": 204,
    "95": 65,
    "96": 10,
    "97": 192,
    "98": 32,
    "99": 12,
    "100": 68,
    "101": 81,
    "102": 239,
    "103": 127,
    "104": 105,
    "105": 27,
    "106": 240,
    "107": 167,
    "108": 24,
    "109": 146,
    "110": 52,
    "111": 22,
    "112": 138,
    "113": 80,
    "114": 240,
    "115": 237,
    "116": 156,
    "117": 140,
    "118": 211,
    "119": 250,
    "120": 71,
    "121": 206,
    "122": 80,
    "123": 109,
    "124": 227,
    "125": 80,
    "126": 83,
    "127": 188,
    "128": 19,
    "129": 213,
    "130": 217,
    "131": 34,
    "132": 141,
    "133": 164,
    "134": 55,
    "135": 190,
    "136": 70,
    "137": 104,
    "138": 88,
    "139": 73,
    "140": 90,
    "141": 161,
    "142": 55,
    "143": 137,
    "144": 162,
    "145": 53,
    "146": 180,
    "147": 213,
    "148": 198,
    "149": 33,
    "150": 210,
    "151": 60,
    "152": 31,
    "153": 130,
    "154": 33,
    "155": 65,
    "156": 87,
    "157": 249,
    "158": 68,
    "159": 140,
    "160": 230,
    "161": 109,
    "162": 105,
    "163": 200,
    "164": 163,
    "165": 55,
    "166": 249,
    "167": 203,
    "168": 144,
    "169": 224,
    "170": 71,
    "171": 132,
    "172": 134,
    "173": 149,
    "174": 14,
    "175": 9,
    "176": 254,
    "177": 89,
    "178": 220,
    "179": 156,
    "180": 167,
    "181": 161,
    "182": 87,
    "183": 206,
    "184": 80,
    "185": 165,
    "186": 247,
    "187": 11,
    "188": 116,
    "189": 99,
    "190": 71,
    "191": 0,
    "192": 204,
    "193": 122,
    "194": 63,
    "195": 206,
    "196": 0,
    "197": 0,
    "198": 0,
    "199": 0,
    "200": 73,
    "201": 69,
    "202": 78,
    "203": 68,
    "204": 174,
    "205": 66,
    "206": 96,
    "207": 130,
    "208": 0,
    "209": 0,
    "byteLength": 210
  },
  "length": 210,
  "byteOffset": 0,
  "byteLength": 210
}

After some more tests, I've successfully sent the example JSON to Azure DevOps as an attachment, but now I have to transform the base64 image that I have in the email object into an array of bytes, like the example from the documentation. Any tip?
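One way to do that conversion, for example in a small console app or Azure Function, could be something like the following C# sketch (the sample base64 string is just a placeholder for the attachment's ContentBytes, and Newtonsoft.Json is assumed for serialization):

using System;
using System.Linq;
using Newtonsoft.Json;

class Base64ToIndexedBytes
{
    static void Main()
    {
        // Placeholder base64 input; in the flow this would be the attachment's ContentBytes.
        string contentBytes = "iVBORw0KGgoAAAANSUhEUg==";

        // Decode the base64 string into raw bytes.
        byte[] bytes = Convert.FromBase64String(contentBytes);

        // Build an object keyed by byte index ("0", "1", ...), mirroring the
        // serialized byte-array shape shown in the documentation example.
        var indexed = bytes
            .Select((value, index) => new { index, value })
            .ToDictionary(x => x.index.ToString(), x => (int)x.value);

        Console.WriteLine(JsonConvert.SerializeObject(indexed, Formatting.Indented));
    }
}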

Hi @naulacambra,  I am trying to do the same... do you mind sharing the image of the flow you used to process the attachments?

Hi @maricel0422, in the end I wasn't able to do it "properly". Even so, let me explain my workaround, as it could work for you.

 

The emails had two types of attachments: the "in-body" attachments (images inside the email body) and the "attached" attachments. Both of them are described in an array ("attachments") at the end of the email object.

 

First, about the in-body attachments:

When an email is received and the flow is triggered, the received object has the in-body "src" attribute of the images replaced by something like "src=cid:XXXXXXXXXX@YYYYYYYY". I'm not sure if this is exclusive to my email client (Outlook.com) or happens all the time. My solution here was to loop through the attachments array, look for each attachment's "id" in the body, and replace the "src" attribute with the base64 representation of the ContentBytes, roughly as sketched below.
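As a rough illustration of that replacement step (a hedged C# sketch only; the ContentId, ContentType and ContentBytes property names are assumptions based on the attachment objects described above), it boils down to swapping each "cid:" reference for a "data:" URI built from the base64 content:

using System.Collections.Generic;

// Assumed shape of one entry in the email's "attachments" array.
public record InlineAttachment(string ContentId, string ContentType, string ContentBytes);

public static class BodyInliner
{
    // Replace every "cid:<ContentId>" reference in the HTML body with an inline
    // data: URI built from the attachment's base64 ContentBytes.
    public static string InlineCidImages(string htmlBody, IEnumerable<InlineAttachment> attachments)
    {
        foreach (var a in attachments)
        {
            htmlBody = htmlBody.Replace(
                $"cid:{a.ContentId}",
                $"data:{a.ContentType};base64,{a.ContentBytes}");
        }
        return htmlBody;
    }
}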

 

About the attached attachments:

My solution was to upload the attachment to OneDrive and then append the share link to the body content (which later becomes the Work Item description). It's not fancy, but it works.

 

Here you will find a link to the JSON template of my flow. It has a lot more functionality, but I'm sure you'll be able to extract the part that you're interested in.

 

I hope this helps you

hammer
New Member

I was able to do this using an Azure Logic App and an Azure Function, instead of Flow. Not sure of your familiarity, but from a design/implementation perspective the two (Logic App, Flow) are similar. I think my solution would work in Flow; I just didn't add a custom connector, which I think you need in order to call an Azure Function from Flow.

 

So, using an Azure Logic App:

 

  • Trigger:  When a new email arrives
    • Check for subject filter to exist
  • Action:  HTML to text (strip out html from email)
  • Action: Create a work item (DevOps)
  • Action:  HttpTrigger (Azure Function)
    • Pass in attachments from email
    • Azure function parses the body (attachments), and creates/attaches for each attachment (inline supported also) - basically just does 2 web requests (for each attachment passed in):
      • POST to create the attachment
      • PATCH to add the attachment to the specific work item
  • Action: Send user email with DevOps ticket # (id)

 

If anyone would like me to elaborate further or has any questions please let me know.  Tested and works.  

Hi @hammer,

would you mind elaborating on the Azure Function part? Even better if you could share your code directly.

 

thanks a lot!

Yes, a more specific example of what you did (codez plz, LOL).

I am new to TFS and DevOps and we are replacing existing Mantis functionality here.

Attachments are a big part (we get screenshots of issues)...

hammer
New Member

So, here is what I did on a more granular level; this also now uses MS Flow and not a Logic App:

 

  1. Create an Azure Function to handle uploading/associating an attachment to a DevOps work item
    1. This will also need a DevOps API token
  2. Create an Azure API that will let you access the Azure Function via web request
  3. Send the attachment data from MS Flow using the HTTP request action to the Azure API

 

Microsoft Flow:

  • URI = URL created in the Azure API that will hit the Azure Function
  • SubscriptionKey = Key for the subscription in the Azure API
  • Body (a sample body is sketched after this list):
    • Attachments - this comes from the email trigger (not in pic)
    • Subject - this comes from the email trigger
    • WorkItemId - this comes from the 'create work item 2' step so the function knows which work item to work with
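For reference, based on the properties the function code below reads (name, workItemId, and an attachments array with Name and ContentBytes), the JSON body sent from the Flow HTTP action would presumably look something like this (property names and values are illustrative only):

{
  "name": "Bug report from email",
  "workItemId": "1234",
  "attachments": [
    {
      "Name": "screenshot.png",
      "ContentType": "image/png",
      "ContentBytes": "<base64 content from the email trigger>"
    }
  ]
}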

 

Azure Function Code:

#r "Newtonsoft.Json"

using System.Net;
using System.IO;
using System.Text;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    string name = req.Query["name"];
    string workItemId = req.Query["workItemId"];

    // Azure DevOps personal access token; for Basic auth this should be the Base64 encoding of ":{PAT}"
    string tokenFromDevops = @"{TOKEN}";
    string urlDevopsOrgAddAttach = @"https://dev.azure.com/{ORGNAME}/_apis/wit/attachments?api-version=5.0&fileName=";

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    
    name = name ?? data?.name;
    workItemId = workItemId ?? data?.workItemId;

    foreach (var a in data.attachments)
    {
        string attachmentBytes = a.ContentBytes;
        string attachmentFileName = a.Name;
        var bytesFromB = Convert.FromBase64String(attachmentBytes);
        string url = urlDevopsOrgAddAttach + attachmentFileName;
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.KeepAlive = false;
        request.Method = "POST";
        request.Headers.Add("Authorization", "Basic " + tokenFromDevops);
        request.ContentType = "application/octet-stream";
        Stream requestStream = request.GetRequestStream();
        requestStream.Write(bytesFromB, 0, bytesFromB.Length);
        requestStream.Close();

        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        var rspUploadAttachment = new StreamReader(response.GetResponseStream()).ReadToEnd();
        dynamic jsonResponse = JsonConvert.DeserializeObject(rspUploadAttachment);
        String urlUploadedAttachment = jsonResponse.url;

        // now we need to associate the uploaded attachment to an actual work item
        string urlWorkItem = @"https://dev.azure.com/{ORGNAME}/{PROJECTNAME}/_apis/wit/workitems/" + workItemId + "?api-version=5.0";  
        
        HttpWebRequest requestAttach = (HttpWebRequest)WebRequest.Create(urlWorkItem);
        requestAttach.Method = "PATCH";
        requestAttach.Accept = "application/json";
        requestAttach.Headers.Add("Authorization", "Basic " + tokenFromDevops);
        requestAttach.ContentType = "application/json-patch+json";
        
        string jsonPatchAttach = @"[
  {
    'op': 'add',
    'path': '/fields/System.History',
    'value': 'Adding files from Azure automation'
  },
  {
    'op': 'add',
    'path': '/relations/-',
    'value': {
      'rel': 'AttachedFile',
      'url': 'urlReplaceToken',
      'attributes': {
        'comment': 'Attachment added from Azure automation'
      }
    }
  }
]";
        // this can just be done in json above, but left as-is
        jsonPatchAttach = jsonPatchAttach.Replace("urlReplaceToken", urlUploadedAttachment);
        using (var streamWriter = new StreamWriter(requestAttach.GetRequestStream()))
        {
            streamWriter.Write(jsonPatchAttach);
            streamWriter.Flush();
        }

        HttpWebResponse responseAttach = (HttpWebResponse)requestAttach.GetResponse();
        var rspAddAttachment = new StreamReader(responseAttach.GetResponseStream()).ReadToEnd();
        // do whatever you want here with response - or nothing
        // dynamic jsonResponse = JsonConvert.DeserializeObject(rspAddAttachment);
    }

    return name != null
        ? (ActionResult)new OkObjectResult($"WorkItemID Created!")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}

 

 

Thanks @hammer, you saved me a ton of time getting up and running with a bug tracking implementation using Power Automate Flow and an Azure Function similar to the one you shared.

 

Also, FYI to anyone who also ran into the issue with the DevOps "Send an HTTP request to Azure DevOps" action not handling image uploads properly: I think I may have found out why. See GitHub user @obvioussean's comment here:

https://github.com/MicrosoftDocs/vsts-rest-api-specs/issues/211#issuecomment-503719409

 

Apparently the DevOps Flow connector for "Send an HTTP request to Azure DevOps" is not sending the binary content as true binary but rather as a string. This explains why the image that gets "uploaded" is corrupted. 

 

If anyone else was referring to the DevOps API docs for attachments when implementing this, you were probably also misled by the request body portion showing a String type. According to GitHub user @obvioussean in the same issue, the documentation system generated this incorrectly.

@naulacambra Do you know how to import JSON files into Power Automate to create a flow?

Sounds like the solution you shared via JSON file is the flow I need.

Hi @ka05th30ry,

 

I have exactly the problem you described with DevOps.

I have a successful flow with "Send an HTTP request to Azure DevOps", but I cannot open my images.

See also my issue: https://powerusers.microsoft.com/t5/Using-Flows/Send-an-HTTP-request-to-Azure-DevOps-Successfull-but...

 

So I guess you know how to send the binary content correctly?

Just using the "HTTP" connector instead of the DevOps one? If so, how do you create the binary content?

 

I have already tried it with the "HTTP" connector with the following body:

 

base64ToBinary(items('Apply_to_each')?['contentBytes'])
 
...but same issue.
 
Would be great if you could help me.
 
Thank you!
 
Cheers,
Sven

There are some coding hurdles to solve. I've found a video which explains it very well: Power Automate - Create Azure DevOps WorkItem and Attachments - YouTube

nehaparakh
Regular Visitor

@Beichler77 I followed the video, but I am still getting the following error in the last step of patching. I have attached the screenshot; please help me identify the issue.

[Screenshot: nehaparakh_0-1694788030613.png]

 
