Tutorial: Stream live with Media Services by using .NET 5.0


In Azure Media Services, live events are responsible for processing live streaming content. A live event provides an input endpoint (ingest URL) that you then provide to a live encoder. The live event receives input streams from the live encoder and makes them available for streaming through one or more streaming endpoints. Live events also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.

This tutorial shows how to use .NET 5.0 to create a pass-through live event. In this tutorial, you will:

  • Download and configure the sample.
  • Examine the code that performs live streaming.
  • Create a live event and get the ingest and preview URLs.
  • Create an asset, a live output, and a streaming locator.
  • Watch the event.
  • Clean up resources.

Even though the tutorial uses .NET SDK examples, the general steps are the same for REST API, CLI, or other supported SDKs.

Prerequisites

You need the following items to complete the tutorial:

  • An Azure subscription and a Media Services account, along with the credentials for accessing the Media Services API.
  • The .NET 5.0 SDK installed on your development machine.

You need these additional items for live-streaming software:

  • A camera or a device (like a laptop) that's used to broadcast an event.

  • An on-premises software encoder that encodes your camera stream and sends it to the Media Services live-streaming service through the Real-Time Messaging Protocol (RTMP). For more information, see Recommended on-premises live encoders. The stream has to be in RTMP or Smooth Streaming format.

    This sample assumes that you'll use Open Broadcaster Software (OBS) Studio to broadcast RTMP to the ingest endpoint. Install OBS Studio.

Review Live streaming with Media Services v3 before proceeding.

Download and configure the sample

Clone the GitHub repository that contains the live-streaming .NET sample to your machine by using the following command:

git clone https://github.com/Azure-Samples/media-services-v3-dotnet.git

The live-streaming sample is in the Live folder.

Open appsettings.json in your downloaded project. Replace the values with the credentials that you got when you set up API access.

You can also use the .env file format at the root of the project to set your environment variables only once for all projects in the .NET samples repository. Just copy the sample.env file, and then fill out the information that you got from the Media Services API Access page in the Azure portal or from the Azure CLI. Rename the sample.env file to just .env to use it across all projects.

The .gitignore file is already configured to prevent publishing this file into your forked repository.
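
For reference, a completed .env file might look something like the following. The values are placeholders, and the key names are an assumption based on the sample.env file in the repository (the article itself only mentions AADCLIENTID and AADSECRET); always copy the real keys from sample.env:

# Hypothetical .env values; replace with the values from your
# Media Services API Access page.
AADCLIENTID=00000000-0000-0000-0000-000000000000
AADSECRET=<your-client-secret>
AADTENANTID=00000000-0000-0000-0000-000000000000
SUBSCRIPTIONID=00000000-0000-0000-0000-000000000000
RESOURCEGROUP=<your-resource-group>
ACCOUNTNAME=<your-media-services-account>
ARMENDPOINT=https://management.azure.com/
ARMAADAUDIENCE=https://management.core.windows.net/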

Important

This sample uses a unique suffix for each resource. If you stop debugging or terminate the app before it runs to completion, you'll end up with multiple live events in your account.

Be sure to stop the running live events. Otherwise, you'll be billed!

Examine the code that performs live streaming

This section examines functions defined in the Authentication.cs file (in the Common_Utils folder) and Program.cs file of the LiveEventWithDVR project.

The sample creates a unique suffix for each resource so that you don't have name collisions if you run the sample multiple times without cleaning up.
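
A minimal sketch of that approach, using the resource names that appear later in this article (the exact helper in the sample may differ):

// Append a short random suffix to each resource name so that repeated
// runs of the sample don't collide with resources from earlier runs.
string uniqueness = Guid.NewGuid().ToString("N")[..12];
string liveEventName = $"liveevent-{uniqueness}";
string assetName = $"archiveAsset-{uniqueness}";
string liveOutputName = $"liveOutput-{uniqueness}";
string streamingLocatorName = $"streamingLocator-{uniqueness}";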

Start using Media Services APIs with the .NET SDK

Authentication.cs creates an AzureMediaServicesClient object by using the credentials supplied in the local configuration files (appsettings.json or .env).

An AzureMediaServicesClient object allows you to start using Media Services APIs with .NET. To create the object, you need to supply credentials for the client to connect to Azure by using Azure Active Directory, which is implemented in GetCredentialsAsync. Another option is to use interactive authentication, which is implemented in GetCredentialsInteractiveAuthAsync.

public static async Task<IAzureMediaServicesClient> CreateMediaServicesClientAsync(ConfigWrapper config, bool interactive = false)
{
    ServiceClientCredentials credentials;
    if (interactive)
        credentials = await GetCredentialsInteractiveAuthAsync(config);
    else
        credentials = await GetCredentialsAsync(config);

    return new AzureMediaServicesClient(config.ArmEndpoint, credentials)
    {
        SubscriptionId = config.SubscriptionId,
    };
}

In the code that you cloned at the beginning of the article, the GetCredentialsAsync function creates the ServiceClientCredentials object based on the credentials supplied in the local configuration file (appsettings.json) or through the .env environment variables file in the root of the repository.

private static async Task<ServiceClientCredentials> GetCredentialsAsync(ConfigWrapper config)
{
    // Use ConfidentialClientApplicationBuilder.AcquireTokenForClient to get a token using a service principal with symmetric key

    var scopes = new[] { config.ArmAadAudience + "/.default" };

    var app = ConfidentialClientApplicationBuilder.Create(config.AadClientId)
        .WithClientSecret(config.AadSecret)
        .WithAuthority(AzureCloudInstance.AzurePublic, config.AadTenantId)
        .Build();

    var authResult = await app.AcquireTokenForClient(scopes)
                                             .ExecuteAsync()
                                             .ConfigureAwait(false);

    return new TokenCredentials(authResult.AccessToken, TokenType);
}

In the case of interactive authentication, the GetCredentialsInteractiveAuthAsync function creates the ServiceClientCredentials object based on interactive authentication and the connection parameters supplied in the local configuration file (appsettings.json) or through the .env environment variables file in the root of the repository. In that case, AADCLIENTID and AADSECRET are not needed in the configuration or environment variables file.

private static async Task<ServiceClientCredentials> GetCredentialsInteractiveAuthAsync(ConfigWrapper config)
{
    var scopes = new[] { config.ArmAadAudience + "/user_impersonation" };

    // Client application ID of the Azure CLI, used here for interactive sign-in
    string ClientApplicationId = "04b07795-8ddb-461a-bbee-02f9e1bf7b46";

    AuthenticationResult result = null;

    IPublicClientApplication app = PublicClientApplicationBuilder.Create(ClientApplicationId)
        .WithAuthority(AzureCloudInstance.AzurePublic, config.AadTenantId)
        .WithRedirectUri("http://localhost")
        .Build();

    var accounts = await app.GetAccountsAsync();

    try
    {
        // Try to reuse a cached token silently before prompting the user
        result = await app.AcquireTokenSilent(scopes, accounts.FirstOrDefault()).ExecuteAsync();
    }
    catch (MsalUiRequiredException)
    {
        try
        {
            result = await app.AcquireTokenInteractive(scopes).ExecuteAsync();
        }
        catch (MsalException msalException)
        {
            Console.Error.WriteLine($"ERROR: MSAL interactive authentication exception with code '{msalException.ErrorCode}' and message '{msalException.Message}'.");
        }
    }
    catch (MsalException msalException)
    {
        Console.Error.WriteLine($"ERROR: MSAL silent authentication exception with code '{msalException.ErrorCode}' and message '{msalException.Message}'.");
    }

    if (result == null)
    {
        // Guard against returning a null token if both acquisitions failed
        throw new InvalidOperationException("Interactive authentication failed; no access token was acquired.");
    }

    return new TokenCredentials(result.AccessToken, TokenType);
}

Create a live event

This section shows how to create a pass-through type of live event (LiveEventEncodingType set to PassthroughBasic or PassthroughStandard). For information about the available types, see Live event types. In addition to pass-through, you can use a live transcoding event for 720p or 1080p adaptive-bitrate cloud encoding.

You might want to specify the following things when you're creating the live event:

  • The ingest protocol for the live event. Currently, the RTMP, RTMPS, and Smooth Streaming protocols are supported. You can't change the protocol option while the live event or its associated live outputs are running. If you need different protocols, create a separate live event for each streaming protocol.

  • IP restrictions on the ingest and preview. You can define the IP addresses that are allowed to ingest a video to this live event. Allowed IP addresses can be specified as one of these choices:

    • A single IP address (for example, 10.0.0.1)
    • An IP range that uses an IP address and a Classless Inter-Domain Routing (CIDR) subnet mask (for example, 10.0.0.1/22)
    • An IP range that uses an IP address and a dotted decimal subnet mask (for example, 10.0.0.1(255.255.252.0))

    If no IP addresses are specified and there's no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set 0.0.0.0/0. IP addresses have to be in one of the following formats: an IPv4 address with four numbers, or a CIDR address range.

  • Autostart on an event as you create it. When autostart is set to true, the live event will start after creation. That means the billing starts as soon as the live event starts running. You must explicitly call Stop on the live event resource to halt further billing. For more information, see Live event states and billing.

    Standby modes are available to start the live event in a lower-cost "allocated" state that makes it faster to move to a running state. This is useful for situations like hot pools that need to hand out channels quickly to streamers. A sketch of this workflow appears after this list.

  • A static host name and a unique GUID. For an ingest URL to be predictable and easier to maintain in a hardware-based live encoder, set the useStaticHostname property to true. For detailed information, see Live event ingest URLs.
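
As an illustration of the standby workflow mentioned above, here's a minimal sketch, assuming the same IAzureMediaServicesClient (client) and configuration objects used elsewhere in this sample:

// Allocate resources and place the live event into the lower-cost
// "StandBy" state instead of starting it immediately.
// https://docs.microsoft.com/rest/api/media/liveevents/allocate
await client.LiveEvents.AllocateAsync(config.ResourceGroup, config.AccountName, liveEventName);

// Later, when you're ready to go live: the StandBy-to-Running transition
// is much faster than a cold start to Running.
await client.LiveEvents.StartAsync(config.ResourceGroup, config.AccountName, liveEventName);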

Console.WriteLine($"Creating a live event named {liveEventName}");
Console.WriteLine();

// Creating the LiveEvent - the primary object for live streaming in AMS. 
// See the overview - https://docs.microsoft.com/azure/media-services/latest/live-streaming-overview

// Create the LiveEvent

// Understand the concepts of what a live event and a live output is in AMS first!
// Read the following - https://docs.microsoft.com/azure/media-services/latest/live-events-outputs-concept
// 1) Understand the billing implications for the various states
// 2) Understand the different live event types, pass-through and encoding
// 3) Understand how to use long-running async operations 
// 4) Understand the available Standby mode and how it differs from the Running Mode. 
// 5) Understand the differences between a LiveOutput and the Asset that it records to.  They are two different concepts.
//    A live output can be considered as the "tape recorder" and the Asset is the tape that is inserted into it for recording.
// 6) Understand the advanced options such as low latency, and live transcription/captioning support. 
//    Live Transcription - https://docs.microsoft.com/en-us/azure/media-services/latest/live-transcription
//    Low Latency - https://docs.microsoft.com/en-us/azure/media-services/latest/live-event-latency

// When broadcasting to a live event, please use one of the verified on-premises live streaming encoders.
// While operating this tutorial, it is recommended to start out using OBS Studio before moving to another encoder. 

// Note: When creating a LiveEvent, you can specify allowed IP addresses in one of the following formats:                 
//      IpV4 address with 4 numbers
//      CIDR address range  

IPRange allAllowIPRange = new(
    name: "AllowAll",
    address: "0.0.0.0",
    subnetPrefixLength: 0
);

// Create the LiveEvent input IP access control object.
// This restricts ingest to the IP range that your encoder runs on.
LiveEventInputAccessControl liveEventInputAccess = new()
{
    Ip = new IPAccessControl(
            allow: new IPRange[]
            {
                // re-use the same range here for the sample, but in production you can lock this
                // down to the ip range for your on-premises live encoder, laptop, or device that is sending
                // the live stream
                allAllowIPRange
            }
        )

};

// Create the LiveEvent Preview IP access control object. 
// This will restrict which clients can view the preview endpoint
LiveEventPreview liveEventPreview = new()
{
    AccessControl = new LiveEventPreviewAccessControl(
        ip: new IPAccessControl(
            allow: new IPRange[]
            {
                 // re-use the same range here for the sample, but in production you can lock this to the IPs of your 
                // devices that would be monitoring the live preview. 
                allAllowIPRange
            }
        )
    )
};

// To get the same ingest URL for the same LiveEvent name:
// 1. Set useStaticHostname to true so you have ingest like: 
//        rtmps://liveevent-hevc12-eventgridmediaservice-usw22.channel.media.azure.net:2935/live/522f9b27dd2d4b26aeb9ef8ab96c5c77           
// 2. Set the inputs:accessToken to a desired GUID string (with or without hyphen) to make it simpler to update your encoder settings

// See REST API documentation for details on each setting value
// https://docs.microsoft.com/rest/api/media/liveevents/create 

LiveEvent liveEvent = new(
    location: mediaService.Location,
    description: "Sample LiveEvent from .NET SDK sample",
    // Set useStaticHostname to true to make the ingest and preview URL host name the same. 
    // This can slow things down a bit. 
    useStaticHostname: true,

    // 1) Set up the input settings for the Live event...
    input: new LiveEventInput(
        streamingProtocol: LiveEventInputProtocol.RTMP,  // options are RTMP or Smooth Streaming ingest format.
                                                         // This sets a static access token for use on the ingest path. 
                                                         // Combining this with useStaticHostname:true will give you the same ingest URL on every creation.
                                                         // This is helpful when you only want to enter the URL into a single encoder one time for this Live Event name
        accessToken: "acf7b6ef-8a37-425f-b8fc-51c2d6a5a86a",  // Use this value when you want to make sure the ingest URL is static and always the same. If omitted, the service will generate a random GUID value.
        accessControl: liveEventInputAccess, // controls the IP restriction for the source encoder.
        keyFrameIntervalDuration: "PT2S" // Set this to match the ingest encoder's settings
    ),
    // 2) Set the live event to use pass-through or cloud encoding modes...
    encoding: new LiveEventEncoding(
        // Set this to Standard (720P) or Premium1080P to use the cloud live encoder.
        // See https://go.microsoft.com/fwlink/?linkid=2095101 for more information
        // Otherwise, set to PassthroughBasic or PassthroughStandard to use the two different pass-through modes. 
        encodingType: LiveEventEncodingType.PassthroughStandard // Choose the type of live event - standard or basic pass-through, or the encoding types for 720P or 1080P
                                                                // OPTIONAL settings when using live cloud encoding type:
                                                                // keyFrameInterval: "PT2S", //If this value is not set for an encoding live event, the fragment duration defaults to 2 seconds. The value cannot be set for pass-through live events.
                                                                // presetName: null, // only used for custom defined presets. 
                                                                //stretchMode: "None" // can be used to determine stretch on encoder mode
    ),
    // 3) Set up the Preview endpoint for monitoring based on the settings above we already set.
    preview: liveEventPreview,
    // 4) Set up more advanced options on the live event. Low Latency is the most common one.
    streamOptions: new List<StreamOptionsFlag?>()
    {
        // Set this to Default or Low Latency
        // When using Low Latency mode, you must configure the Azure Media Player to use the 
        // quick start heuristic profile or you won't notice the change. 
        // In the AMP player client side JS options, set -  heuristicProfile: "Low Latency Heuristic Profile". 
        // To use low latency optimally, you should tune your encoder settings down to 1 second GOP size instead of 2 seconds.
        StreamOptionsFlag.LowLatency
    }
//,
// 5) Optionally enable live transcriptions if desired. This is only supported on PassthroughStandard and the transcoding live event types. It is not supported on the Basic pass-through type.
// WARNING : This is extra cost ($$$), so please check pricing before enabling.
/*transcriptions:new List<LiveEventTranscription>(){
    new LiveEventTranscription(
        // The value should be in BCP-47 format (e.g: 'en-US'). See https://go.microsoft.com/fwlink/?linkid=2133742
        language: "en-us",
        outputTranscriptionTrack : new LiveEventOutputTranscriptionTrack(
            trackName: "English" // set the name you want to appear in the output manifest
        )
    )
}*/
);

// Start monitoring LiveEvent events using Event Grid and Event Hub
try
{
    // Please refer to the README for Event Hub and storage settings.
    // A storage account is required to process the Event Hub events from the Event Grid subscription in this sample.

    // Create a new host to process events from an Event Hub.
    Console.WriteLine("Creating a new client to process events from an Event Hub...");
    var credential = new DefaultAzureCredential();
    var storageConnectionString = string.Format("DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}",
       config.StorageAccountName, config.StorageAccountKey);
    var blobContainerName = config.StorageContainerName;
    var eventHubsConnectionString = config.EventHubConnectionString;
    var eventHubName = config.EventHubName;
    var consumerGroup = config.EventHubConsumerGroup;

    storageClient = new BlobContainerClient(
        storageConnectionString,
        blobContainerName);

    processorClient = new EventProcessorClient(
        storageClient,
        consumerGroup,
        eventHubsConnectionString,
        eventHubName);

    mediaEventProcessor = new MediaServicesEventProcessor(null, null, liveEventName);
    processorClient.ProcessEventAsync += mediaEventProcessor.ProcessEventsAsync;
    processorClient.ProcessErrorAsync += mediaEventProcessor.ProcessErrorAsync;

    await processorClient.StartProcessingAsync();
}
catch (Exception e)
{
    Console.WriteLine("Failed to connect to Event Hub, please refer README for Event Hub and storage settings. Skipping event monitoring...");
    Console.WriteLine(e.Message);
}

Console.WriteLine("Creating the LiveEvent, please be patient as this can take time to complete async.");
Console.WriteLine("Live Event creation is an async operation in Azure and timing can depend on resources available.");

// When autostart is set to true, the Live Event will be started after creation. 
// That means, the billing starts as soon as the Live Event starts running. 
// You must explicitly call Stop on the Live Event resource to halt further billing.
// The following operation can sometimes take a while. Be patient.
// An optional workflow is to first call allocate() instead of create.
// https://docs.microsoft.com/en-us/rest/api/media/liveevents/allocate 
// This allows you to allocate the resources and place the live event into a "Standby" mode until 
// you are ready to transition to "Running". This is useful when you want to pool resources in a warm "Standby" state at a reduced cost.
// The transition from Standby to "Running" is much faster than cold creation to "Running" using the autostart property.
// Returns a long running operation polling object that can be used to poll until completion.

Stopwatch watch = Stopwatch.StartNew();
liveEvent = await client.LiveEvents.CreateAsync(
    config.ResourceGroup,
    config.AccountName,
    liveEventName,
    liveEvent,
    // When autostart is set to true, you should "await" this method operation to complete. 
    // The Live Event will be started after creation. 
    // You may choose not to do this, but create the object, and then start it using the standby state to 
    // keep the resources "warm" and billing at a lower cost until you are ready to go live. 
    // That increases the speed of startup when you are ready to go live. 
    autoStart: false);
watch.Stop();
string elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
Console.WriteLine($"Create Live Event run time : {elapsedTime}");

Get ingest URLs

After the live event is created, you can get the ingest URLs that you'll provide to the live encoder. The encoder uses these URLs to input a live stream.

// Get the RTMP ingest URL to configure in OBS Studio.
// The endpoints collection contains RTMP primary and secondary URLs, as well as RTMPS primary and secondary URLs.
// The primary secure RTMPS URL is usually at index 3, but you could add a loop here to confirm (see the sketch after this code).
string ingestUrl = liveEvent.Input.Endpoints.First().Url;
Console.WriteLine($"The RTMP ingest URL to enter into OBS Studio is:");
Console.WriteLine($"\t{ingestUrl}");
Console.WriteLine("Make sure to enter a Stream Key into the OBS studio settings. It can be any value or you can repeat the accessToken used in the ingest URL path.");
Console.WriteLine();
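
As the comment above notes, the sample simply takes the first endpoint in the collection. A minimal sketch of selecting the secure RTMPS URL explicitly instead (assumes a using directive for System.Linq):

// Prefer the secure RTMPS ingest URL if one is present; otherwise fall
// back to the first endpoint in the collection.
string rtmpsIngestUrl = liveEvent.Input.Endpoints
    .Select(endpoint => endpoint.Url)
    .FirstOrDefault(url => url.StartsWith("rtmps://"))
    ?? ingestUrl;
Console.WriteLine($"The secure RTMPS ingest URL is:");
Console.WriteLine($"\t{rtmpsIngestUrl}");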

Get the preview URL

Use previewEndpoint to preview and verify that the input from the encoder is being received.

Important

Make sure that the video is flowing to the preview URL before you continue.

// Use the previewEndpoint to preview and verify
// that the input from the encoder is actually being received
// The preview endpoint URL also supports the addition of various format strings, for example HLS (format=m3u8-cmaf) and DASH (format=mpd-time-cmaf).
// The default manifest is Smooth. 
string previewEndpoint = liveEvent.Preview.Endpoints.First().Url;
Console.WriteLine($"The preview url is:");
Console.WriteLine($"\t{previewEndpoint}");
Console.WriteLine();

Console.WriteLine($"Open the live preview in your browser and use the Azure Media Player to monitor the preview playback:");
Console.WriteLine($"\thttps://ampdemo.azureedge.net/?url={previewEndpoint}&heuristicprofile=lowlatency");
Console.WriteLine();
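
To request an HLS or DASH preview manifest instead of the default Smooth manifest, append a format string to the preview URL. A minimal sketch, assuming previewEndpoint ends with the default manifest path:

// Append format strings to the default (Smooth) manifest URL to get
// HLS or DASH preview manifests.
string hlsPreviewUrl = previewEndpoint + "(format=m3u8-cmaf)";
string dashPreviewUrl = previewEndpoint + "(format=mpd-time-cmaf)";
Console.WriteLine($"HLS preview:  {hlsPreviewUrl}");
Console.WriteLine($"DASH preview: {dashPreviewUrl}");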

Create and manage live events and live outputs

After you have the stream flowing into the live event, you can begin the streaming event by creating an asset, live output, and streaming locator. This will archive the stream and make it available to viewers through the streaming endpoint.

When you're learning these concepts, it's helpful to think of the asset object as the tape that you would insert into a video tape recorder in the old days. The live output is the tape recorder machine. The live event is just the video signal coming into the back of the machine.

You first create the signal by creating the live event. The signal is not flowing until you start that live event and connect your encoder to the input.

The "tape" can be created at any time. It's just an empty asset that you'll hand to the live output object, the "tape recorder" in this analogy.

The "tape recorder" can also be created at any time. You can create a live output before starting the signal flow, or after. If you need to speed up things, it's sometimes helpful to create the output before you start the signal flow.

To stop the "tape recorder," you call delete on LiveOutput. This action doesn't delete the contents of the "tape" (asset). The asset is always kept with the archived video content until you call delete explicitly on the asset itself.

The next section will walk through the creation of the asset and the live output.

Create an asset

Create an asset for the live output to use. In our analogy, this will be the "tape" that we record the live video signal onto. Viewers will be able to see the contents live or on demand from this virtual tape.

// Create an Asset for the LiveOutput to use. Think of this as the "tape" that will be recorded to. 
// The asset entity points to a folder/container in your Azure Storage account. 
Console.WriteLine($"Creating an asset named {assetName}");
Console.WriteLine();
Asset asset = await client.Assets.CreateOrUpdateAsync(config.ResourceGroup, config.AccountName, assetName, new Asset());

Create a live output

Live outputs start when they're created and stop when they're deleted. When you delete the live output, you're not deleting the underlying asset or content in the asset. Think of it as ejecting the "tape." The asset with the recording will last as long as you like. When it's ejected (meaning, when the live output is deleted), it will be available for on-demand viewing immediately.

// Create the Live Output - think of this as the "tape recorder for the live event". 
// Live outputs are optional, but are required if you want to archive the event to storage,
// use the asset for on-demand playback later, or if you want to enable cloud DVR time-shifting.
// We will use the asset created above for the "tape" to record to. 
string manifestName = "output";
Console.WriteLine($"Creating a live output named {liveOutputName}");
Console.WriteLine();

watch = Stopwatch.StartNew();
// See the REST API for details on each of the settings on Live Output
// https://docs.microsoft.com/rest/api/media/liveoutputs/create
LiveOutput liveOutput = new(
    assetName: asset.Name,
    manifestName: manifestName, // The HLS and DASH manifest file name. This is recommended to set if you want a deterministic manifest path up front.
                                // archive window can be set from 3 minutes to 25 hours. Content that falls outside of ArchiveWindowLength
                                // is continuously discarded from storage and is non-recoverable. For a full event archive, set to the maximum, 25 hours.
    archiveWindowLength: TimeSpan.FromHours(1)
);
liveOutput = await client.LiveOutputs.CreateAsync(
    config.ResourceGroup,
    config.AccountName,
    liveEventName,
    liveOutputName,
    liveOutput);
elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
Console.WriteLine($"Create Live Output run time : {elapsedTime}");
Console.WriteLine();

Create a streaming locator

When your Media Services account is created, a default streaming endpoint is added to your account in the stopped state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the running state.

When you publish the asset by using a streaming locator, the live event (up to the DVR window length) will continue to be viewable until the streaming locator's expiration or deletion, whichever comes first. This is how you make the virtual "tape" recording available for your viewing audience to see live and on demand. The same URL can be used to watch the live event, the DVR window, or the on-demand asset when the recording is complete (when the live output is deleted).

Console.WriteLine($"Creating a streaming locator named {streamingLocatorName}");
Console.WriteLine();

IList<string> filters = new List<string>
{
    drvAssetFilterName
};
StreamingLocator locator = await client.StreamingLocators.CreateAsync(config.ResourceGroup,
    config.AccountName,
    drvStreamingLocatorName,
    new StreamingLocator
    {
        AssetName = assetName,
        StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly,
        Filters = filters   // Associate the dvr filter with StreamingLocator.
    });

// Get the default Streaming Endpoint on the account
StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(config.ResourceGroup, config.AccountName, streamingEndpointName);

// If it's not running, Start it. 
if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
{
    Console.WriteLine("Streaming Endpoint was Stopped, restarting now..");
    await client.StreamingEndpoints.StartAsync(config.ResourceGroup, config.AccountName, streamingEndpointName);

    // Since we started the endpoint, we should stop it in cleanup.
    stopEndpoint = true;
}

// Get the URL to stream the output
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(config.ResourceGroup, config.AccountName, drvStreamingLocatorName);

foreach (StreamingPath path in paths.StreamingPaths)
{
    UriBuilder uriBuilder = new UriBuilder
    {
        Scheme = "https",
        Host = streamingEndpoint.HostName,
        Path = path.Paths[0]
    };
    // Build and print the playback URL for this streaming path
    Console.WriteLine($"\t{uriBuilder}");
}

Clean up resources in your Media Services account

If you're done streaming events and want to clean up the resources provisioned earlier, use the following procedure:

  1. Stop pushing the stream from the encoder.
  2. Stop the live event. After the live event is stopped, it won't incur any charges. When you need to start it again, it will have the same ingest URL so you won't need to reconfigure your encoder.
  3. Stop your streaming endpoint, unless you want to continue to provide the archive of your live event as an on-demand stream. If the live event is in a stopped state, it won't incur any charges.

The sample uses the following helper methods to stop and delete these resources:
private static async Task CleanupLiveEventAndOutputAsync(IAzureMediaServicesClient client, string resourceGroup, string accountName, string liveEventName, string liveOutputName)
{
    try
    {
        LiveEvent liveEvent = await client.LiveEvents.GetAsync(resourceGroup, accountName, liveEventName);

        Console.WriteLine("Deleting Live Output");
        Stopwatch watch = Stopwatch.StartNew();

        await client.LiveOutputs.DeleteAsync(resourceGroup, accountName, liveEventName, liveOutputName);

        string elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
        Console.WriteLine($"Delete Live Output run time : {elapsedTime}");

        if (liveEvent != null)
        {
            if (liveEvent.ResourceState == LiveEventResourceState.Running)
            {
                watch = Stopwatch.StartNew();
                // If the LiveEvent is running, stop it and have it remove any LiveOutputs
                await client.LiveEvents.StopAsync(resourceGroup, accountName, liveEventName, removeOutputsOnStop: false);
                elapsedTime = String.Format(":{0:00}.{1:00}", watch.Elapsed.Seconds, watch.Elapsed.Milliseconds / 10);
                Console.WriteLine($"Stop Live Event run time : {elapsedTime}");
            }

            // Delete the LiveEvent
            await client.LiveEvents.DeleteAsync(resourceGroup, accountName, liveEventName);
        }
    }
    catch (ErrorResponseException e)
    {
        Console.WriteLine("CleanupLiveEventAndOutputAsync -- Hit ErrorResponseException");
        Console.WriteLine($"\tCode: {e.Body.Error.Code}");
        Console.WriteLine($"\tCode: {e.Body.Error.Message}");
        Console.WriteLine();
    }
}
private static async Task CleanupLocatorandAssetAsync(IAzureMediaServicesClient client, string resourceGroup, string accountName, string streamingLocatorName, string assetName)
{
    try
    {
        // Delete the Streaming Locator
        await client.StreamingLocators.DeleteAsync(resourceGroup, accountName, streamingLocatorName);

        // Delete the Archive Asset
        await client.Assets.DeleteAsync(resourceGroup, accountName, assetName);
    }
    catch (ErrorResponseException e)
    {
        Console.WriteLine("CleanupLocatorandAssetAsync -- Hit ErrorResponseException");
        Console.WriteLine($"\tCode: {e.Body.Error.Code}");
        Console.WriteLine($"\tCode: {e.Body.Error.Message}");
        Console.WriteLine();
    }
}
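
In the sample's main flow, these helpers are called during cleanup, roughly as follows (a sketch; the exact call site in Program.cs may differ):

// Delete the live output and stop/delete the live event first, then
// remove the streaming locator and the archived asset.
await CleanupLiveEventAndOutputAsync(client, config.ResourceGroup, config.AccountName, liveEventName, liveOutputName);
await CleanupLocatorandAssetAsync(client, config.ResourceGroup, config.AccountName, streamingLocatorName, assetName);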

Watch the event

Press Ctrl+F5 to run the code. This outputs streaming URLs that you can use to watch your live event. Copy the streaming URL that was printed when the streaming locator was created. You can use a media player of your choice; Azure Media Player is available to test your stream at the Media Player demo site.

A live event automatically converts events to on-demand content when it's stopped. Even after you stop and delete the event, users can stream your archived content as a video on demand for as long as you don't delete the asset. An asset can't be deleted if an event is using it; the event must be deleted first.

Clean up remaining resources

If you no longer need any of the resources in your resource group, including the Media Services and storage accounts that you created for this tutorial, delete the resource group that you created earlier.

Run the following CLI command:

az group delete --name amsResourceGroup

Important

Leaving the live event running incurs billing costs. Be aware that if the project or program stops responding or is closed out for any reason, it might leave the live event running in a billing state.

