
Recap/ Summary of Week #30

At the beginning of every weekend, I share a recap of the week along with a summary of my streams. Those videos are at least two hours long, so I thought a short summary would be useful to help you decide whether a topic interests you. Watch only the summary, or relax and enjoy the longer version; it's up to you!

   

My journey as a technical YouTuber


Sixty-five videos and one year later
It's already been one year since I started sharing videos on YouTube. I had been blogging for many years and wanted to try something new, and creating videos seemed like the next logical step. That is how I decided to start sharing short videos answering technical questions about Microsoft Azure. In this post, I will explain what I learned during the first year of my journey.

The Beginning

I started my YouTube channel in French. During the first two months, I published one video per week. I learned a lot during this period: how to prepare my code snippets, my files, and my screen. The biggest takeaway was that preparation is the key. The more prepared you are, the more efficient you will be in front of the camera. I also discovered that, compared to writing a blog post, you need to pay attention to the tiny details much earlier in the process.

It may sound like a cliché, but it was challenging to watch myself. However, it's the best way to improve. You need to accept your mistakes and promise yourself that next time you will do better. It's normal, and it's okay not to be perfect.

Let's be bilingual

One thing I was looking forward to was being able to share my videos with all my clients, friends, and community members. However, I couldn't, because while some of them are perfectly comfortable in French (and only French), others only understand English. That is why I decided to record all my videos in both French and English. I knew that would mean doubling the workload, so I adapted my schedule to publish every second week.

Doing all those videos, I found my beat, my style, the things I like and dislike. I also spent a lot of time learning the YouTube platform through books and by watching other YouTubers. It's such an incredibly resourceful community!

A New Camera

The software I was using, and that I'm still using today (Camtasia from TechSmith), can record both your screen and your webcam. It keeps the tracks separated, which is great because you can change the size and position of the webcam inset in post-production.

My challenge was that I wanted to record both in 1080p (full HD), because during my intros and conclusions, when the webcam input fills the screen, I want good image quality. However, the software (or more likely my PC) did not let me record both the screen and the webcam input in 1080p.
To upgrade the quality of my recordings, I decided to use my DSLR instead of my webcam. Of course, it means more work in post-production, because now I need to synchronize the two video sources, but now everything is in 1080p.

The famous algorithm

The more I studied my statistics and read the comments (thank you all for your constructive comments, by the way), the more I understood that having all my videos, in both French and English, on the same channel was not the best idea.

First, it was very confusing for the YouTube algorithm, because it couldn't identify a video's language, making it harder for it to do its job and make recommendations.

Secondly, it didn't cross my mind at the beginning, because I understand both languages, but for a subscriber who understands only one of the languages, it was very confusing and not very pleasant to have all those "other", useless videos in the feed.

After a very long (and hard) reflection, I decided to create a second channel and "move" all my French content to it. I say "move", but you cannot move content on YouTube, and that was the principal reason I was hesitating to split my channel. By creating a new channel and re-uploading my older (aka French) videos, I was losing all my stats (views, subscribers) and references.

The effect on the English channel was immediate: the views and subscribers exploded within a week. On the other side, my brand new French channel didn't really catch up. Nevertheless, I intend to continue creating French content. ;)

Today's Status

As this post gets published, I have a cumulative total of sixty-five videos online and more than a thousand subscribers. Every second week, usually on Thursday, I publish two videos on the same topic: one in French on fr.cloud5mins.com and one in English on cloud5mins.com. I have a fantastic, super supportive community that asks questions and suggests topics.



I'm very thankful for all that positive feedback. To help me, you can of course subscribe to my YouTube channel, or become one of my patrons at patreon.com/fboucheros.

What's Next

I'm definitely continuing to create videos. I also plan to start streaming on Twitch, where I will be building cloud solutions.

See you soon,


How to Deploy your Azure Functions Faster and Easily with Zip Push

Azure Functions are great. I used to write a lot of "csx" functions (the C# scripted version), but more recently I switched to the compiled version, and I definitely love it! However, I was looking for a way to keep my deployments short and sweet, because sometimes I don't have time to set up a "big" CI/CD pipeline, or simply because sometimes I'm not the one doing the deployment... In those cases, I need a simple script that deploys everything! In this post, I will share how you can deploy everything with one easy script.

The Context


In this demo, I will deploy a simple C# (full .NET Framework) Azure Function. I will create the Azure Function App and storage using an Azure Resource Manager (ARM) template, and deploy the code with a method named Zip push, or ZipDeploy. All the code, the script, and the template are available on my GitHub.

The Azure Functions Code


The Azure Function doesn't have to be special; it can be written in any language supported by Azure Functions. Just to show you everything, here is the code of my function.


using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using GenFu;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

namespace AzFunctionZipDeploy
{
    public static class Function1
    {
        [FunctionName("GetTopRunner")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            // Read the "top" parameter from the query string.
            string top = req.GetQueryNameValuePairs()
                .FirstOrDefault(q => string.Compare(q.Key, "top", true) == 0)
                .Value;

            // Fall back to the request body if it was not on the query string.
            if (top == null)
            {
                dynamic data = await req.Content.ReadAsAsync<object>();
                top = data?.top;
            }

            return top == null
                ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a number to get your top x runner on the query string or in the request body")
                : req.CreateResponse(HttpStatusCode.OK, new { message = $"Hello, here is your Top {top} runners", runners = A.ListOf<Person>(int.Parse(top)) });
        }
    }

    class Person
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public int Age { get; set; }
    }
}

It's a really simple function that returns a list of Person objects generated on the fly. The list contains as many people as the number passed as a parameter. I'm using the very useful GenFu library, from my buddies the ASP.NET Monsters.

The only thing we need to do is create a compressed (zip) file that contains everything our project requires.

[Screenshot: creating the zip file]

In this case, that means the project file (AzFunction-ZipDeploy.csproj), the function's code (Function1.cs), the host configuration (host.json), and the local settings of our function (local.settings.json).
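
If you prefer to script this step too, here is a minimal sketch using the zip command-line tool; the file names match this demo's project, so adjust them to your own:

    # Create the output folder and compress the project files into one archive.
    mkdir -p ./zip
    zip ./zip/AzFunction-ZipDeploy.zip \
        AzFunction-ZipDeploy.csproj \
        Function1.cs \
        host.json \
        local.settings.json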

The ARM template


For this demo, we need one Azure Function App. I will use a template that is part of the Azure Quickstart Templates. A quick look at the azuredeploy.parameters.json file shows that the only parameter we really need to set is the name of our application.


{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "appName": {
            "value": "zipdeploydemo"
        }
    }
}

To be able to use ZipDeploy, we need to add one application setting to let the Kudu interface know that we need its help to compile our code. To do that, let's open azuredeploy.json and go to the appSettings section. We need to add a new setting named SCM_DO_BUILD_DURING_DEPLOYMENT and set it to true. After adding it, the section should look like this (see the last entry... that's our new one):


"appSettings": [
    {
    "name": "AzureWebJobsDashboard",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
    "name": "AzureWebJobsStorage",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
    "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountid'),'2015-05-01-preview').key1)]"
    },
    {
    "name": "WEBSITE_CONTENTSHARE",
    "value": "[toLower(variables('functionAppName'))]"
    },
    {
    "name": "FUNCTIONS_EXTENSION_VERSION",
    "value": "~1"
    },
    {
    "name": "WEBSITE_NODE_DEFAULT_VERSION",
    "value": "6.5.0"
    },
    {
    "name": "SCM_DO_BUILD_DURING_DEPLOYMENT",
    "value": true
    }
]
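
If you would rather not touch the template, the same setting can also be applied after deployment with the Azure CLI. Here is a minimal sketch, assuming the app and resource group names used in this demo:

    # Add the Kudu build-on-deploy setting to an existing Function App.
    az functionapp config appsettings set \
        --name zipdeploydemo \
        --resource-group cloud5mins \
        --settings SCM_DO_BUILD_DURING_DEPLOYMENT=true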

The Deployment Script


Now that all the pieces are ready, it's time to put them together in one script. In fact, only the last two commands are required; everything else is just there to make the script easier to reuse. Check out my previous post, 5 Simple Steps to Get a Clean ARM Template, to learn more about the best practices related to ARM templates. So let's see that script; it's pretty simple.

    # script to Create an Azure Gramophone-PoC Solution

    resourceGroupName=$1
    resourceGroupLocation=$2

    templateFilePath="./arm/azuredeploy.json"
    parameterFilePath="./arm/azuredeploy.parameters.json"

    dateToken=`date '+%Y%m%d%H%M'`
    deploymentName="FrankDemo"$dateToken

    # az login

    # You can select a specific subscription if you do not want to use the default
    # az account set -s SUBSCRIPTION_ID

    if ! $(az group exists -g $resourceGroupName); then
        echo "---> Creating the Resourcegroup: " $resourceGroupName
        az group create -g $resourceGroupName -l $resourceGroupLocation
    else
        echo "---> Resourcegroup:" $resourceGroupName "already exists."
    fi

    az group deployment create --name $deploymentName --resource-group $resourceGroupName --template-file $templateFilePath --parameters $parameterFilePath --verbose

    echo "---> Deploying Function Code"
    az functionapp deployment source config-zip -g $resourceGroupName -n zipdeploydemo --src "./zip/AzFunction-ZipDeploy.zip"

    echo "---> done <--- code="">

The only "new" thing is the last command functionapp deployment source config-zip. That where we specify to the Azure Function App to look to --src to get our source. Because I'm running it locally, the path is pointing to a local folder. However, you could execute this command also in the CloudShell, and that would become a URI... to an Azure Blob Storage by example.

Deploy and Test


If you didn't notice yet, I wrote my script in bash with the Azure CLI. That's because I want my script to be compatible with all platforms. Of course, you could have done it in PowerShell or anything else that can call the REST API.

To deploy, just execute the script, passing the name of the resource group and its location.

    ./Deploy-AZ-Gramophone.sh cloud5mins eastus

[Screenshot: script outputs]

To get the function URL, go to the Azure portal (portal.azure.com) and click on the Function App that we just deployed. Click on the function, GetTopRunner in this case, then click the </> Get function URL button.

[Screenshot: Get function URL]

Use that URL in Postman, passing a value for the top parameter, to see that the deployment was successful.

[Screenshot: testing the function in Postman]
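
If you prefer the command line to Postman, the same test can be done with curl. A sketch, assuming this demo's app name and the default *.azurewebsites.net hostname; the code query-string parameter is the function key copied from the portal:

    # Call the deployed function, asking for the top 5 runners.
    curl "https://zipdeploydemo.azurewebsites.net/api/GetTopRunner?code=<function-key>&top=5"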

In Video Please


If you prefer, I also have a video version of this post.



~Enjoy!

How to copy files between Azure subscriptions from Windows, Linux, OS X, or the cloud

(in French: here)

Copy, download, or upload from and to any combination of Windows, Linux, OS X, or the cloud


Data is and will always be our primary concern. Whether shaped as text files, images, VM VHDs, or in any other way, at some point in time our data will need to be moved. I have already written about this previously, and the content of that post is still valid today, but I wanted to share new options and cover all the ground (meaning Linux, Windows, and OS X).

Scenarios


Here are a few scenarios where you would want to move data.
  • Your Microsoft Azure trial is ending, and you wish to keep all the data.
  • You are creating a new web application, and all those images need to be moved to the Azure subscription.
  • You have a Virtual Machine that you would like to move to the cloud or to a different subscription.
  • ...

AzCopy


AzCopy is a fantastic command-line tool for copying data to and from Microsoft Azure Blob, File, and Table storage. AzCopy used to be available only on Windows; however, a second version, built with the .NET Core framework, was recently released and also works on Mac and Linux. The commands are very similar, but not exactly the same.

AzCopy on Windows


In its simplest expression, an AzCopy command looks like this:
AzCopy /Source:<source> /Dest:<destination> [Options]
If you have previously installed an Azure SDK on your machine, you already have it. By default, AzCopy is installed to %ProgramFiles(x86)%\Microsoft SDKs\Azure\AzCopy (64-bit Windows) or %ProgramFiles%\Microsoft SDKs\Azure\AzCopy (32-bit Windows).

If you only need AzCopy on a server, you can download the latest version of AzCopy.
Let's look at some frequent usage. First, let's say you need to move all those images from your server to an Azure Blob storage container.
AzCopy /Source:C:\MyWebApp\images /Dest:https://frankysnotes.blob.core.windows.net/blog /DestKey:4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== /S
[Screenshot: copying all images]

Then, copying those images to another subscription is very easy:
AzCopy /Source:https://frankysnotes.blob.core.windows.net/blog /Dest:https://frankshare.blob.core.windows.net/imagesbackup /SourceKey:4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== /DestKey:EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== /S

AzCopy Parameters


These examples were simple, but AzCopy is a very powerful tool. I invite you to type one of the following commands to discover more about it:
  • For detailed command-line help for AzCopy: AzCopy /?
  • For command-line examples: AzCopy /?:Samples

AzCopy on Linux


Before you can install AzCopy, you will need to install .NET Core. This is done very simply with a few commands.
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-xenial-prod xenial main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-get update
sudo apt-get install dotnet-sdk-2.0.2
Then, to install AzCopy itself, you just need to download it with wget, extract the archive, and execute the install script.
wget -O azcopy.tar.gz https://aka.ms/downloadazcopyprlinux 
tar -xf azcopy.tar.gz 
sudo ./install.sh
In its simplest expression, the .NET Core version of an AzCopy command looks like this:
azcopy --source <source> --destination <destination> [Options]
It is very similar to the original version, but parameters use -- and - instead of /, and where a : was required, it's now a simple space.

Uploading to Azure


Here is an example that copies a single file, GlobalDevopsBootcamp.jpg, to an Azure Blob storage container. We pass the full local path of the file to --source, the destination is the full URI, and finally we pass the destination storage key. Of course, you could also use a SAS token if you prefer.
azcopy \
--source /home/frank/demo/GlobalDevopsBootcamp.jpg \
--destination https://frankysnotes.blob.core.windows.net/blog/GlobalDevopsBootcamp.jpg \
--dest-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== 

Copying Between Azure Subscriptions


To copy the image to a second Azure subscription, we use the same command; the source is now an Azure Storage URI, and we pass both the source and the destination keys:
azcopy \
--source https://frankysnotes.blob.core.windows.net/blog/GlobalDevopsBootcamp.jpg \
--destination https://frankshare.blob.core.windows.net/imagesbackup/GlobalDevopsBootcamp.jpg \
--source-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== \
--dest-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== 

Azure CLI


Azure CLI is a set of cross-platform commands for the Azure platform. It gives you tools to manipulate all Azure components, but this post will focus on the storage features.

There are two versions of the Azure Command-Line Interface (CLI) currently available:

  • Azure CLI 2.0: written in Python, compatible only with the Resource Manager deployment model.
  • Azure CLI 1.0: written in Node.js, compatible with both the classic and Resource Manager deployment models.

Azure CLI 1.0 is deprecated and should only be used for support with the Azure Service Management (ASM) model with "classic" resources.

Installing Azure CLI


Let's start by installing Azure CLI. Of course, you can download an installer, but since everything is evolving very fast, why not get it from the Node Package Manager (npm)? The install command is the same; you just need to specify the version if you absolutely need Azure CLI 1.0.

sudo npm install azure-cli -g

[Screenshot: Azure CLI installed]

To keep with the previous scenario, let's try to copy all the images to a blob storage container. Unfortunately, Azure CLI doesn't offer the same flexibility as AzCopy, and you must upload the files one by one. However, to upload all the images from a folder, we can easily put the command in a loop.

for f in Documents/images/*.jpg
do
   azure storage blob upload -a frankysnotes -k YoMjXMDe+694FGgOaN0oaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQWvGrWnWj5m4X1U0HIPUIAA==  $f blogimages
done

[Screenshot: uploading all images with Azure CLI]

In the previous command, -a is the account name and -k is the access key. These two pieces of information can easily be found in the Azure portal. From the portal (https://portal.azure.com), select the storage account, then click on Access keys in the resource menu.

[Screenshot: storage access keys]
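
If you prefer to stay in the terminal, the keys can also be retrieved with the CLI. A minimal sketch using the Azure CLI 2.0 syntax covered later in this post; the resource group name here is a placeholder:

    # List the access keys of a storage account.
    az storage account keys list \
        --resource-group <resource-group> \
        --account-name frankysnotes \
        --output table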

To copy a file (e.g., a VM disk, aka a VHD) from one storage account to another, in a different subscription or region, it's really easy. This time we will use the command azure storage blob copy start; the -a and -k parameters relate to our destination.

azure storage blob copy start 'https://frankysnotes.blob.core.windows.net/blogimages/20151011_151451.MOV' imagesbackup -k EwXpZ2uZ3zrjEbpBGDfsefWkj3GnuFdPCt2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== -a frankshare

The nice thing about this command is that it's asynchronous. To see the status of your copy, just execute the command azure storage blob copy show:

azure storage blob copy show -a frankshare -k YoMjXMDe+694FGgOaN0oPaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQVxsqhWvGrWnWj5m4X1U0HIPUIAA== imagesbackup 20151011_151451.MOV

[Screenshots: copy status output]

Azure CLI 2.0 (Windows, Linux, OS X, Docker, Cloud Shell)


Azure CLI 2.0 is Azure's new command-line experience, optimized for managing and administering Azure resources that work with the Azure Resource Manager. Like the previous version, it works perfectly on Windows, Linux, OS X, and Docker, but also from the Cloud Shell!


Cloud Shell is available right from the Azure portal, without any plugin.

Uploading to Azure


The commands are the same as in the previous version, except that the executable is now named az. Here is an example that uploads a single file into an Azure Blob storage container.

az storage blob upload --file /home/frank/demo/CloudIsStrong.jpg \
--account-name frankysnotes \
--container-name blogimages --name CloudIsStrong.jpg \
--account-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA==

Copying Between Subscriptions


Let's now copy the file to another Azure subscription. A thing to be aware of is that --account-name and --account-key refer to the destination, even if that's not explicit.
az storage blob copy start \
--source-account-name frankysnotes  --source-account-key 4YvvYDTg3UUpky8Rj5bDG4KO/R1FdtssxVnunsEd/4rAS04V2LkO0F8mXbddAv39WtCo5LW6JyvfhA== \
--source-container blogimages --source-blob CloudIsStrong.jpg   \
--account-name frankshare  --account-key EwXpZ2uZ3zrjEbpBGDfsefWkj3G2QY5fJcb6kMqV2A0+2TsGno+mk9vEXc5Uw1XiouvAiTS7Kr5OGzA== \
--destination-container imagesbackup  \
--destination-blob CloudIsStrong.jpg 

In Video Please!


If you prefer, I also have a video version of that post.



One More Thing


Sometimes we don't need to script things, and a graphical interface is much better. For those situations, the must-have is the Azure Storage Explorer. It does a lot: upload, download, and manage blobs, files, queues, tables, and Cosmos DB entities. And it works on Windows, macOS, and Linux!


It's just the beginning


This post was just an introduction to two very powerful tools. I strongly suggest reading the official documentation to learn more. Use the comments to share all your questions and suggestions.


Reading Notes #313

Suggestion of the week


Cloud


Programming


Databases


Miscellaneous


How to Automatically Generate Video Sub-Title in Another Language

I recently started a French YouTube channel. Quickly, I got a message asking me to add English subtitles, and I also got a suggestion to leverage Azure Logic Apps and some Cognitive Services to help with that task. I really liked the idea, so I gave it a shot. I recorded myself, and in twenty minutes I was done. Even though it was not the success I was hoping for, the application works perfectly. It's just that speaking in French with a lot of English technical words was a little bit too hard for the Video Indexer. However, if you speak only one language in your video, this solution works perfectly. In this post, I will show you how to create that Logic App with Azure Video Indexer and Cognitive Services.

The Idea


Once a video is dropped in a OneDrive folder (or any file system accessible from Azure), a Logic App gets triggered and uploads the file to the Azure Video Indexer, generates a Video Text Tracks (VTT) file, and saves this new file in another folder. A second Logic App then starts, uses the Translator Text API from Azure Cognitive Services to translate the VTT file, and saves it into the final folder.

[Diagram: subtitle generation workflow]


The Generation


Before getting started, you will need to create your Video Indexer API subscription. To do this, log in to the Video Indexer developer portal and subscribe to the Video Indexer APIs - Production in the Products tab. You should then get your API keys.


[Screenshot: API key page]

For more details on the subscription, refer to the documentation. To learn the names, parameters, and code samples for all the methods available in your new API, click on the APIs tab.

[Screenshot: API details]

Now let's create our first Logic App. I always prefer to start with a blank template, but take whatever fits you. Any online file system trigger will do; in this case, I'm using the When a file is created trigger from OneDrive. I had some issues with this trigger: it was not always fired by a new file. I tried the When a file is modified trigger, but it didn't solve the problem. If you think you know what I was doing wrong, feel free to leave a comment. :)

The first real action is to upload the file to the Azure Video Indexer. We can do that very easily by using the method Upload video and index, passing the name and content from the trigger.

Of course, the longer the video is, the longer the processing will be, so we will need to wait. A way to do that is to add a waiting loop: use the method Get processing state from the Video Indexer and loop until the status is "processed". To slow down the loop, just add a wait action and set it to three or five minutes.

When the file is completely processed, it's time to retrieve the VTT file. This is done in two simple steps. First, we get the URL by calling the method Get the transcript URL; then, with a simple HTTP GET, we download the file. The last thing we need to do is save it in the folder where our second Logic App will be watching for new drops.

In the visual designer, the Logic App should look like this.

[Screenshot: the Logic App generating the VTT file]
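
For the curious, the wait-and-download pattern this Logic App implements boils down to polling one endpoint, then fetching a URL. Here is a minimal sketch in bash with curl; the host and routes are illustrative placeholders, so use the exact endpoints listed in the APIs tab of your Video Indexer subscription:

    # Poll the processing state until the video is processed.
    # VI_KEY is your Video Indexer API key; VIDEO_ID is the id returned by the upload.
    until curl -s -H "Ocp-Apim-Subscription-Key: $VI_KEY" \
            "https://<video-indexer-host>/Breakdowns/$VIDEO_ID/State" | grep -q "Processed"
    do
        sleep 180   # wait three minutes between checks
    done

    # Get the transcript (VTT) URL, then download the file.
    VTT_URL=$(curl -s -H "Ocp-Apim-Subscription-Key: $VI_KEY" \
        "https://<video-indexer-host>/Breakdowns/$VIDEO_ID/VttUrl")
    curl -s -o subtitle.vtt "$VTT_URL"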


The Translation


The second Logic App is very short. Once again, it gets triggered by a new file in our OneDrive folder. Then it's time to call our Translator Text API from Azure Cognitive Services. Thanks to the great Logic App interface, it's very intuitive to fill in all the parameters for the call. Once we get the translation, we save it to its final destination.

The Logic App should look like this.

[Screenshot: the Logic App translating the VTT file]
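
For reference, the call the Logic App makes for us is essentially this HTTP request. A sketch against version 3.0 of the Translator Text API; the subscription key and the sample text are placeholders:

    # Translate a piece of text to English with the Translator Text API (v3.0).
    curl -s -X POST \
        "https://api.cognitive.microsofttranslator.com/translate?api-version=3.0&to=en" \
        -H "Ocp-Apim-Subscription-Key: <translator-key>" \
        -H "Content-Type: application/json" \
        -d '[{"Text": "Bonjour, voici vos sous-titres."}]'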


Conclusion


It was much easier than I expected. I really like implementing these integration projects with Logic Apps. It's so easy to "plug" all those APIs together with this interface. And yes, as I mentioned in the introduction, the result was not great. I ran tests with videos purely in English (even with my accent) or purely in French (no mix), and the results were really good. So I think the problem really is the fact that I mix French and English. I could improve the Indexer by spending time providing it with files so the service could better understand my "Franglish". However, for twenty minutes of work, I'm really impressed by the way it turned out. If you have ideas on how to improve this solution, or if you have questions, feel free to leave a comment. You can also watch my French YouTube video.

All the code is available online on GitHub - Cloud5VideoHelper.




My new YouTube Channel

I've been blogging with you for about ten years now on this blog, and it's been a pleasure. More recently I started, quietly, a French blog named Cloud en Français; feel free to have a look.
Today, I'm super excited to share my new project with you: a YouTube channel with original content published every week... in French! Every Wednesday, I will publish a five-minute video answering a problem or a question.

Come see me online, subscribe, ask questions...

Cloud en 5 minutes

youtubechannel

