SharePoint eDiscovery OR Records Management, but not both!

NOTE:  This blog post is context sensitive to the current state (9/2017) of on-premises SharePoint 2013 and 2016 environments and has no relation to any new evolving data labeling or retention in Office 365.  It has come to my attention some very significant features will be announced at Ignite around filling the traditional Records Management feature gaps.
Stay tuned for a post on why you should upgrade/migrate to the latest and greatest!
You see many blog posts by many of my peers.  SharePoint eDiscovery is great, Records Management is good (yeah, only good if not bad).  But what you rarely hear is the truth of the matter.
Just like in the highlander series…”there can be only one”.  As new features are added to any product, they will start to collide.  The collisions are becoming more frequent and the regression tests more infinite.
That all being said, if you only had one to choose from, eDiscovery or Records Management.  Which one would you choose?
eDiscovery is pretty awesome when taken as a single entity.  The ability to find and lock down (aka Hold) items across Exchange and SharePoint for the purposes of litigation for example.
Records Management (again as a single entity) is great in that you can enforce the users to not be able to make modifications to declared records. Albeit, only in the sense that they live in the records center or you have enabled the “In-place records management” feature in a site.
So why can you only have one?  Hmm, let’s get to the meat of it shall we?
eDiscovery holds are great in that they allow you to target content such that it does not get modified.  It does this through the somewhat well known Preservation Hold Library.  That’s all fine and dandy, but the reality that most people don’t get is that when you add a “site” as a source and then create the hold, it holds the entire site by default.  Admins (not end users) can get around this by enabling the “hidden” feature of Query based preservation.  But when you do this, it adds an entirely new issue of performance to the whole conversation, but we’ll leave that out here.
Ok, so where does the choice come in?
Well, if you want to put a hold on your entire SharePoint Farm, you have to select all the sites in your sources for the eDiscovery case.  Its doable, painful, but doable.   So what does that do?  It of course will make it such that any time a user modifies a document, it will go into preservation hold.
Ok, so what happens when you need to archive a document to the records center?  You aren’t destroying it…you aren’t modifying it (per se), you are following basic procedures to create a record.  Well, unfortunately my friends, the reality is that you will get an ugly error that the document has in fact moved to the records center, but that a link could not be created:
This is so very bad.  You now have a document sitting in the records center, but yet it is also sitting in the source site.  What happens is you send it again, which is of course what a user will try to do!  Yup, another version is created and unless you have versioning turned on, you get the crazy FileName_ZUOXUDF filename.  The original document will sit there…forever…or…until the hold is lifted and then of course, the data owner may have left the company and whala…a orphaned record that no one really knows what to do with!
Ouch.
SharePoint has never truly been a records management platform (content organizer and rules are a complete failure).
Because of this, it’s the reason four different companies formed (RecordPoint, RecordLion, Gimmal, Collabware) to solve the problems that SharePoint has had for quite some time.  I’ll skip the inadequacies of those products for now, but likely you’ll get my full opinion on them later in life…
This particular instance of features colliding though…it really pushes organizations over the proverbial limit of understanding and patience.
In reality, the way things should go is…are you sending something to a trusted records center?  Yes?  Ok cool…I’ll let that action occur because I know its going to someplace good and monitored and approved.
Not…fail no matter what cuz one feature overrides another!
Pick one…but only one.
Chris

Do you trust O365’s trust of others?

So I’m renewing my CISSP and will be making a push into the security space pretty hard in the next few months.   Part of that will be doing things like in this post.
I am deep into identity and auth flows and have been doing a ton with ADFS/OAuth with Intune etc.
A few days ago it hit me that a person has the ability to modify the claims that ADFS sends to O365.  That got me to thinking…
What will happen if O365 does its realm redirect where a user logs in as one person yet the claim for the UPN is different than the original?  Will it work and if so, what are the ramifications of it?
So, here’s my general thinking of what I want to do (didn’t end up being how to do…keep reading):

  • Setup ADFS and a federated domain in O365 (chrisgivens.com)
  • Modify the O365 ADFS claims to be someone\something other than what the actual login implied.
    • IE…I login as chris@chrisgivens.com, but the claim that is sent back is actually spoof@chrisgivens.com

  • Share a site with spoof@chrisgivens.com
  • See if it all works…

So…first part.  ADFS.  When you setup the O365 relay it adds in its own claims rules so that it gets what it is expecting.  The general set of claims that get sent are:

  • http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier
  • http://schemas.xmlsoap.org/claims/UPN
  • http://schemas.microsoft.com/LiveID/Federation/2008/05/ImmutableID

In addition to the other basic claims as seen here:

Ok, cool.  So here’s my goal.  Set it such that no matter who logs in, the identity and claims that get sent to O365 are really someone else!  That requires a bit of claim manipulation.  Here’s an example:

See what I’m doing?  I’m setting the email to be something else other than the original value from the auth’d AD user!  I do this for all the email claim fields.  The though is that Azure AD will utilize the email\UPN as its “source of truth” for who the user is.  Ok, great.  So let’s try it out!
Going to the faithful login.microsoftonline.com, I enter in a set of credentials for my federated domain, and get redirected to ADFS.  I log in to ADFS page, and the first go around, I got this error:

After removing competing claim rules and re-trying the login, this is the beauty you end up:

It worked…kinda.  Notice that id.  That is what I though was the valid ObjectGUID.  Nope, its a bit different than that.  So how the heck do I view the claims that ADFS is sending?
Well, good luck with that with the out of box logging of ADFS.  You can attempt to follow this post to turn on all the verbose logging, but its only helpful for resolving the competing claims issue or the fact I did not have the SAML Consumer setup on the replying party.
So, what do you do?  You swing over to Auth0.com and you setup your ADFS with them, then you can use all their cool debugging tools to see what is actually coming across!  Awesome…

So, after looking at the logs and seeing the original claim rule applied, I see that the ObjectGUID is base64 encoded.  So I copy it and paste it into my claim rule.
Ok, let’s try this again…I sign on, type a username and password, ADFS does its thing, I get redirected back to the site, and whala…I’m in as the other user!  Holy shizer balls, it worked…
It wasn’t what I though would happen being that I had to find the ObjectGUID and that is what Azure AD goes off of, not the email claims.
So what does this mean?  It means whoever has access to ADFS for the remote federated domain, can open up the ADSI edit tool, find the ObjectGUID for a user (or even use basic powershell such as “Get-ADUser username -Properties ObjectGUID | Select *”), paste in some rules and BAM…they are in as that user in the O365 application layer.  
They do NOT have to be a domain admin to query Active Directory for ObjectGUID, nor do you have to be a domain admin to manage ADFS.  You can simply be an ADFS Admin.
This is significant in that unlike a domain admin, when a domain admin changes a password or an AD object, the AD object changes are very likely audited and red flags thrown!
In my experience, a change to ADFS claim rules is very rarely audited and monitored and if it is, it is unlikely to throw up any major red flags.  Any ADFS admin can make a change and login at any time and bam, you can be anyone, anytime.  This leads to…log your ADFS 510 events:

Hope you trust whoever is running the target federated auth server (no matter what it is).  Or that you trust whoever you are sharing things with if they are doing federation and not Azure AD directly!
I think it would be nice to be able to set in the configuration settings that I don’t want to allow my users to share data with a domain that has federation enabled and that it must be managed by Azure AD!  Maybe even be able to set it at a very specific level such as Site/Web.

Free SharePoint Apps for Everyone!

Have had this in my back pocket for a while.  I just checked to see if it still works and sure enough…it does.  You can see I tore deep into the SP App Store design with this 4 year old post.  What I didn’t show was how one can get the app packages for free and by pass paying for the apps.
This only works with Apps that have a “trial”.  Ones that you have to pay for will not be open to this hack as you would have to buy them first.  But technically once you have bought it, you can use this same hack to then go post their app package on the internet.
How does it work you ask?  If you read through the steps of the post referenced above, one of the back end abstracted parts is to download the app package such that it can be deployed to the target web.  The app package will not be put into the database and is required to be downloaded each and every time you request it.  In the App Mgmt database you will find the RawXMLEntitlementToken.

This is generated when you click through the Microsoft billing portion of the app install.  All apps (Free, Trial, Paid) have to run through this in order to get the token.  Once you have this token, you can use it to download the app package by simply pasting this into your browser window:
https://store.office.com/appinstall/unauthenticated?cmu=en-US&clienttoken=%3cr%20v%3d%221%22%3e%3ct%20aid%3d%22WA103532495%22%20pid%3d%22%7b1d39b4be%2D60e0%2D45d3%2D9084%2D186800df12e2%7d%22%20cid%3d%225B0427C7B8C9F259%22%20did%3d%22%7b4AA4B65C%2D9085%2D41B7%2DBC31%2D4553E50D3AAA%7d%22%20ts%3d%220%22%20sl%3d%22true%22%20et%3d%22Free%22%20ad%3d%222013%2D10%2D05T01%3a32%3a24Z%22%20sd%3d%222013%2D10%2D05%22%20te%3d%222018%2D05%2D31T22%3a44%3a03Z%22%20ss%3d%220%22%20%2f%3e%3cd%3e0o9KoXMWV%2bOazF2J02jVoMDkzptamrbN9%2fIZw0m09V8%3d%3c%2fd%3e%3c%2fr%3e&ret=0&build=15.0.4763.1000&av=OSU150&origin=EC101785291&corr={96ae724f-5d59-49b3-8fcd-79191c3e1728}
This will download the .cab file for the target app.  Once you have this, you don’t need to pay for the App when the trial expires.  Just install it to your app catalog or directly into the site.
Enjoy!
Chris

Source file could not be replaced with a link – SharePoint

You may run across this in your dealings with SharePoint 2013 or 2016 when attempting to move things to the Records Center and leave a link behind.
It used to be because the file was checked out, but with all the new features of SharePoint these days, you get many components stepping on each others toes.
In this case, you will find the actual error is not bubbling up from the lower levels in the stack.  A quick search in the ULS logged will point you to this gem:
“Version of this item cannot be deleted because it is on hold”
Ah ha…remove the hold that is placed on the document (if you were testing anyway) and then you can continue the move process.  The unfortunate thing is that the document does successfully get submitted to the records center, so a resubmissions will trigger the whole versioning or _ASDSF randomness.  Such is life in SharePoint when you have end points calling other end points that live outside one another’s thread space.

Action 16.1.1.0 of Microsoft.Office.Project.Server.Database.Extension.Upgrade.PDEUpgradeSequence failed – SharePoint 2016 Upgrade

What a crazy error. 

There are a few posts about this but very little that tell you what is going on.  When you attempt to upgrade the farm with the following command (after applying a public update or even after the 2016 upgrade):

PSConfig.exe -cmd upgrade -inplace b2b -wait -force -cmd applicationcontent -install -cmd installfeatures -cmd secureresources

You could get the above error in one of the final upgrade stepsactions.  Psconfig.exe is attempting to upgrade all your service application and content databases to your current binary level.  This particular error will show up if someone in your organization was "smart" enough to decided to make a content database a project server database.  You will see this by looking at the tables in the database.  If you see any that are related to Project Server, then you know someone did something really dumb.

You can get your farm to upgrade by simple removing the offending database (aka detach it), then run the command to upgrade everything else.  

Resolution would be to remove all the project server based tables (make a backup of course) and then try your upgrade again. You could also attempt to tell SharePoint that the database is a "Project Server" database and attempt to upgrade it via Project Server but no guarantee that will work especially if the databases are from 2007/2010 days.  You would need a Project Server environment to upgrade all the way from the old version to the latest.

Enjoy, 
Chris 

sprocsSchemaVersion must not be null – SharePoint 2016 Upgrade

You may run across this error.  I inadvertently did.  It happens when the content database upgrade process fails expectantly and then the Mount-ContentDatabase won't execute the upgrade again because it thinks it is already upgraded. Unfortunately, this is a catastrophic failure and will require you to restore your Content database and rerun the content database upgrade. 

How did I run into this?  Well, I opened many powershell windows to be "multi-threaded" in my upgrade and the upgrade code didn't like that at all.  It complained about a shared log file and killed off half the threads.  I guess you shouldn't fire that off more than one instance like you could in 2013 and 2010!

Be sure you always backup your databases before you upgrade in case you need to rollback!
Chris 

Hybrid Search Results not displaying

Help!  My Hybrid Search isn't working!  What could it be?!?

  1. So you have successfully run the hybrid scripts here.
  2. You have ensured that the results are flowing to your Cloud Search Service Application via the Cloud SSA crawl logs.
  3. You go to do a search using your Azure AD cloud sync'd account and you get…nothing…what!??!
  4. You look things over and over again…maybe I didn't do this, maybe I didn't do that…no, it all looks good!

The possible cause:

  1. You didn't setup a User Profile Service Application (kinda rare I know)
  2. You view your "sync'd" site collection users profile and notice that the local "email address" does not match your cloud email address…doh!
    1. chris@companyone.com vs chris@companytwo.com
  3. You change the values to match using the edit list item feature of the site users list
  4. You re-run the cloud ssa crawl
  5. Go back to the cloud search center…ta da…results!  Just another reminded that UPNs have to match for the results to process and this value comes from the site collection users list when you have no UPS!

Hope this helps someone!
Chris

Other Helpful posts on a similar topic:

Cuba, Socialism vs Capitalism from a good ole boy – Recap of my trip to Cuba!

It's interesting to me that my last two trips that have been out of the country have been to China and Cuba, two communistsocialist countries.  Being I'm a USA born, good ole boy from Oklahoma, we can just say that it was as eye opening as moving from Oklahoma to Seattle and getting "hit on" the first time by a gay guy, which isn't bad, just different!

My Background: 

A bit of my background…I grew up with almost nothing, worked hard and the wonderful U. S. of A provided and rewarded me based on that hard work.  My mom worked hard and paid for everything from her job and the little child support dad gave us.  From what I remember, food stamps kicked in at one time, so to say I never drank from the US Democratic faucet would be false.  So, what does that have to do with anything?  Cuba.  What an eye opening experience, even more so than China as they seem to be some kind of mix of socialism and capitalism (unlike the pure socialism of Cuba).  As you will see, its not practical for someone to be able to out grow your neighbors salary wise as the average salary in Cuba is right around $40, with developers making somewhere around $500/month.  But the reality is they don't have to pay for their education, or their health care.  Imagine that…you can take as many courses as you want, get as smart as you want…or be as lazy as you want.  If only we could do that in the USA…

Getting there:

Getting to Cuba is very easy.  You can buy a Visa online before you go, or at the point of departure (the actual flight to Cuba) you can buy it.  You also need health insurance, although, being that health care is free, it was a bit odd at this was required. The price of your ticket currently includes $25 health insurance policy and your boarding pass serves as your insurance card.  Be sure that you don't lose the second part of the Visa on your way back!

Cuban Government: 

Most things in China are owned by the government, the hotel we stayed in for instance, the Hotel National, is owned by the government:

 

Although a nice hotel, it is old and run down.  There are other hotels that offer much nicer rooms and amenities, but to have stayed in a hotel that some of the greats have stayed in…we will just say it was an experience you can't compare to anything else!

 

We walked down to the US Embassy which is only about half a mile from the hotel.  Its on the water on the Malcon and its a very nice looking building:

 

Prices and Money: 

There are two currencies in Cuba.  The CUC and the CUP.  The CUC is for tourists, the CUP is for the locals.  You can pay with either if you have it, you just are likely to have CUCs.  I never had an issue with someone paying change in CUP back, but you should be aware of it and this blog is very helpful.  Net net is CUC are pictures of monuments, CUP are pics of individuals.  The price to change out the US dollar was ok at the airport.  The hotel was actually better by about .01.  On the way out, I got back 99% of my USD from my CUC at the airport.

The prices in Cuba are soooo cheap!  Most drinks (Mojito anyone?) are $5CUC at major hotels, if you venture out, you will find them for $2-3CUC and yes, they are super strong wherever you go!

 

Food:

The food in Cuba was amazing!  There was not a single meal that I didn't like!  The prices were soo cheap that I actually would buy 2 meals each time (yeah, I may look skinny, but I eat a lot!). By the way, a massive super yummy dinner for 9 of us at a sit down nice restaurant, was $130CUC.   The one thing you need to realize about sites like TripAdvisor is that the reviews are made by tourists that come off of cruise ships and its a part of their excursion package.  So if you try to go to one of these top 10 places, you will very likely be turned away as they book the entire venue for the excursion folks.  Face control seems to work well, so we were lucky and actually got into some of the places by smiling and laughing a bit…once we the staff realized we weren't with the cruise ships, we got much better attention!

 

People:

The people in Cuba were sooo nice!  You can tell that not having to worry about getting an education, getting treated for a disease or sickness made them very carefree and easy going.  As an aside, they created a vaccine for lung cancer, it makes sense as one of their major exports is tobacco!  I was able to bring back some Cohiba Behikes (52/54/56).  If you don't know what those are, you gotta look em up!

 

InternetNetwork: 

Just know that your hotel wifi will likely not be the same wifi as the public uses and the username/password you get for the RADIUS server won't work out on the open.  I had 45 minutes left on the hotel wifi and could not log into the ETECSA wifi at the airport! 

Socialism: 

So…get this…my presentation was focused on "how to make money" in Cuba. You can find it here.  Needless to say, making money in Cuba as a corporation is not something you will be able to do for quite some time.  Locals are not allowed to start corporations, so as a programmer, you can't start a consulting company and hire people.  The concept of a corporation doesn't really exist in Cuba.  You have to get government approval for everything and as we painfully learned, you also need permits for tech gathering! 

SharePointMicrosoftBusiness:

The people in Cuba do have a computer science based curriculum that teaches C#.  They use Microsoft products.  They don't have the best computers to run things on.  Most computers are imported via family and friends.  So most places aren't going to have some fancy server room where you can run the latest and greatest server OS and server based products.  You also can't count on the network to support cloud services so don't even think about selling cloud services there.  You have to have a local presence.  Someone will need to build a co-lo facility to host all the major players.  I'm sure it will be owned by the Cuban government.  You'll just have to put your servers in it to get any decent bandwidth.  Google is trying harder than all the others, but the progress is slow.  There is a trial to doing broadband.  I did see co-axel cables run all over, and there were lots of CAT5 cables run between houses…a lot of the cables were used for door bells though…LOL.

Soccer: 

Kids are playing soccer everywhere!  It was very hard for me to not actually get out and play (I have a torn ACL right now), but I did manage to get the courage to brave a fully tore ACL in front of a cool church (yeah those are my old Gucci soccer shoes):

 

Random Photos:

McFly!  McFly!  Yeah, there are old cars everywhere and at times the smog was a bit unbearable, this guy seemed to make the best of it:

 

Cubans love their country!  Some cools pics:

 

Get your drugs at a Harry Potter drug store!  This place was deep inside Old Havana in what we would easily call a "run down" part:

 

Our trip home:

Getting home is easy, the airport is pretty fast, although I can see if there were a lot of people it might be a bit crowded at times.  But we had no issues.  The only issue we had was the flight path back home, evidentially we flew over a tornado:

 

Summary: 

Although the major business opportunities in Cuba are in the low teens, and "USCapitalism" like careers non-existent, the environment is fun and the people are amazingly smart and carefree.  They live with what they have and make the best of it.  Something that a majority of Americans need to learn to appreciate.  As I came home and walked through my front door, I felt that appreciation for what I have (and what I don't have and have had), and all the opprotunities I have been given by being "born in the USA".

But at the same time, I felt really sorry for myself in having the wool pulled over my eyes with the stigma and residual of capitalism and how things really shouldn't be based solely on money and success, but on what you can contribute to your community, country and family.  It's what you don't know you don't know…

Go to Cuba before it changes too much!
Chris 

Using Nintex Workflow Cloud and Azure Functions to call on-premises workflows

This is from a lab from the soon to be released Nintex End User Training courses.  Its the first of its kind and I'll simply say, you will learn some fun stuff!  

  1. Login to
    Azure, or create a trial for Azure Functions
    • Trial
      https://functions.azure.com/try
    • Portal
      1. Select
        your subscription
      2. For
        the function name, type “NintexExternalStart”
      3. Select
        a region
      4. Click
        “Create + get started”, this can take a few seconds
  2. Click the
    “New Function” button
  3. Select the
    “HttpTrigger-CSharp” template:
  4. Click
    “Create"
 Copy the following to your azure function:
 
#r "Newtonsoft.Json"
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");    
    string url = "https://run.nintex.io/x-start/YourEndPoint";
    string securityKey = "YourSecurityKey";
    var variableValues = new Dictionary<string, object>();
    var service = new ExternalStartApiClient();
    try
    {
        log.Info("C# HTTP trigger function processed a request.");
        var correlationId = service.StartWorkflow(url, securityKey, variableValues, log);
    }
    catch
    {
    }
    string name = "Chris";
    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}
    
    public class ExternalStartApiClient
    {
        public RestClient Client { get; internal set; }
        #region Constructors
        public ExternalStartApiClient()
        {
            this.Client = new RestClient();
        }
        public ExternalStartApiClient(RestClient client)
        {
            this.Client = client;
        }
        #endregion
        /// <summary>
        /// Retrieves information about the workflow variables available for the Nintex workflow associated with the
        /// specified endpoint URL.
        /// </summary>
        /// <param name="endpointUrl">The External Start endpoint URL associated with a Nintex workflow.</param>
        /// <returns>A List object that contains a WorkflowVariable object for each available workflow variable.</returns>
        public List<WorkflowVariable> GetWorkflowVariables(string endpointUrl)
        {
            // Send a GET request to the specified endpoint URL. No authorization or authentication is
            // required for this request.
            HttpResponseMessage response = this.Client.Invoke(HttpMethod.Get, endpointUrl, null, null);
            // If a response is returned, check the HTTP status code.
            if (response != null)
            {
                switch (response.StatusCode)
                {
                    case HttpStatusCode.OK:
                        // Success – deserialize and return the list of workflow variables.
                        return this.Client.DeserializeJson<List<WorkflowVariable>>(response);
                    case HttpStatusCode.BadRequest:
                        // Failure – the endpoint URL could not access a workflow.
                        throw new ArgumentException("The request could not be processed.");
                    default:
                        throw new Exception("An unexpected error has occurred.");
                }
            }
            else
            {
                return null;
            }
        }
        /// <summary>
        /// Sends a message to start the Nintex workflow associated with the specified endpoint URL, optionally specifying 
        /// values for workflow variables defined by that workflow
        /// </summary>
        /// <param name="endpointUrl">
        /// The External Start endpoint URL associated with a Nintex workflow.
        /// </param>
        /// <param name="securityKey">
        /// The security key associated with the endpoint URL.
        /// </param>
        /// <param name="workflowVariables">
        /// The workflow variable values to use when starting the workflow.
        /// </param>
        /// <returns>
        /// The <see cref="string"/>.
        /// The X-CorrelationId: {GUID} of the request
        /// </returns>
        public string StartWorkflow(string endpointUrl, string securityKey, Dictionary<string, object> workflowVariables, TraceWriter log)
        {
            // If no workflow variable values are provided, send an empty request body; otherwise, send
            // a serialized JSON object, in which each workflow variable value is represented as a property value.
            string requestBody = "";
            if (workflowVariables != null)
            {
                JObject associationData = new JObject();
                foreach (string key in workflowVariables.Keys)
                {
                    JProperty value = new JProperty(key, workflowVariables[key]);
                    associationData.Add(value);
                }
                requestBody = associationData.ToString();
            }
            // Retrieve and configure the values used to calculate the digest value for the request.
            var path = new Uri(endpointUrl).AbsolutePath.ToLower();
            var httpMethod = HttpMethod.Post.ToString().ToLower();
            var nonce = Guid.NewGuid().ToString();
            var timestamp = DateTime.UtcNow.ToString("O");
            // Calculate and return the digest value for the request.
            var digest = CalculateDigest(securityKey, httpMethod, path, nonce, timestamp, requestBody);
            log.Info(digest);
            log.Info(timestamp);
            log.Info(nonce);
            // Specify the header values for the request.
            var headerValues = new Dictionary<string, string>();
            headerValues.Add("X-Api-Digest", digest);
            headerValues.Add("X-Api-Timestamp", timestamp);
            headerValues.Add("X-Api-Nonce", nonce);
            headerValues.Add("X-Api-Source", "ExternalStart");
            // Send the request to the endpoint URL. 
            HttpResponseMessage response = this.Client.Invoke(HttpMethod.Post, endpointUrl, headerValues,
                new StringContent(requestBody, Encoding.UTF8, "application/json"));
            
            if (response != null)
            {
                log.Info(response.StatusCode.ToString());
                switch (response.StatusCode)
                {
                    case HttpStatusCode.OK:
                        // Success – the message was successfully sent.
                        IEnumerable<string> correlationIds;
                        response.Headers.TryGetValues("X-CorrelationId", out correlationIds);
                        return correlationIds.FirstOrDefault();
                        break;
                    case HttpStatusCode.BadRequest:
                        // Failure – the endpoint URL could not access a workflow.
                        throw new ArgumentException("The request could not be processed.");
                    default:
                        throw new Exception("An unexpected error has occurred.");
                }
            }
            else
            {
                throw new Exception("An unexpected error has occurred.");
            }
        }
        /// <summary>
        /// Calculate the digest value used to authenticate requests for the External Start feature.
        /// </summary>
        /// <param name="securityKey">The security key associated with an External Start endpoint URL.</param>
        /// <param name="httpMethod">The HTTP method name, in lower case.</param>
        /// <param name="path">The absolute URI of the External Start endpoint URL, in lower case.</param>
        /// <param name="nonce">The nonce value.</param>
        /// <param name="timestamp">The date and time of the request, in ISO 8601 format.</param>
        /// <param name="requestBody">The serialized body of the request.</param>
        /// <returns>A keyed hash value, using the SHA-256 function, to be used as the Hash-based Authentication Code (HMAC) value for a request.</returns>
        public string CalculateDigest(string securityKey, string httpMethod, string path, string nonce,
            string timestamp, string requestBody)
        {
            // The data values are concatenated into a single string, in which each data value is delimited by 
            // a colon (:) character, which is then encoded as a UTF-8 byte array.
            var dataBytes = Encoding.UTF8.GetBytes(String.Join(":", httpMethod, path, nonce, timestamp, requestBody));
            // The security key is encoded as a UTF-8 byte array.
            var keyBytes = Encoding.UTF8.GetBytes(securityKey);
            // Using the HMACSHA256 object provided by .NET Framework, the
            // data values are hashed, using the security key, and any dashes are removed.
            using (var hasher = new HMACSHA256(keyBytes))
            {
                var hashBytes = hasher.ComputeHash(dataBytes);
                return BitConverter.ToString(hashBytes).Replace("-", "");
            }

        }
    }
    public class RestClient
    {
        public NetworkCredential Credential { get; set; }
        #region Constructors
        public RestClient()
        {
        }
        public RestClient(NetworkCredential credential)
        {
            this.Credential = credential;
        }
        #endregion
        #region Deserialization
        /// <summary>
        /// Deserialize the body of a specified HTTP response as an instance of a specified type.
        /// </summary>
        /// <typeparam name="T">The type into which to deserialize.</typeparam>
        /// <param name="restResponse">The HTTP response from which to deserialize.</param>
        /// <param name="jsonConverters">If needed, any custom JSON converters with which to deserialize.</param>
        /// <returns>An instance of the specified type, deserialized from the body of the specified HTTP response.</returns>
        internal T DeserializeJson<T>(HttpResponseMessage restResponse, JsonConverter[] jsonConverters = null)
        {
            return this.DeserializeJson<T>(restResponse.Content.ReadAsStringAsync().Result, jsonConverters);
        }
        /// <summary>
        /// Deserialize a specified JSON-encoded string as an instance of a specified type.
        /// </summary>
        /// <typeparam name="T">The type into which to deserialize.</typeparam>
        /// <param name="jsonString">The string from which to deserialize.</param>
        /// <param name="jsonConverters">If needed, any custom JSON converters with which to deserialize.</param>
        /// <returns>An instance of the specified type, deserialized from the body of the specified string.</returns>
        internal T DeserializeJson<T>(string jsonString, JsonConverter[] jsonConverters = null)
        {
            if (jsonConverters != null)
                return JsonConvert.DeserializeObject<T>(jsonString, jsonConverters);
            else
                return JsonConvert.DeserializeObject<T>(jsonString);
        }
        #endregion
        #region Serialization
        /// <summary>
        /// Serialize the specified object as a JSON-encoded string.
        /// </summary>
        /// <param name="value">The object from which to serialize.</param>
        /// <param name="jsonConverters">If needed, any custom JSON converters with which to serialize.</param>
        /// <returns>A JSON-encoded string that contains the serialization of the specified object.</returns>
        internal string SerializeJson(object value, JsonConverter[] jsonConverters = null)
        {
            if (jsonConverters != null)
            {
                return JsonConvert.SerializeObject(value, Formatting.Indented, jsonConverters);
            }
            return JsonConvert.SerializeObject(value, Formatting.Indented);
        }
        #endregion
        #region REST invocation
        /// <summary>
        /// Invokes the specified REST resource, using the specified HTTP method, optionally providing any specified header values and request content.
        /// </summary>
        /// <param name="operationMethod">The HTTP method with which to invoke the REST resource.</param>
        /// <param name="operationUrl">The operation URL with which to invoke the REST resource.</param>
        /// <param name="operationHeaders">A collection of header names and values to include when invoking the REST resource.</param>
        /// <param name="operationContent">The HTTP content to include when invoking the REST resource.</param>
        /// <returns></returns>
        internal HttpResponseMessage Invoke(HttpMethod operationMethod, 
            string operationUrl, 
            Dictionary<string, string> operationHeaders, 
            HttpContent operationContent)
        {
            HttpResponseMessage response = null;
            try
            {
                // Instantiate a new HttpClientHandler object and, if credentials are provided,
                // configure and include them.
                var clientHandler = new HttpClientHandler { PreAuthenticate = true };
                if (this.Credential == null)
                {
                    clientHandler.UseDefaultCredentials = true;
                }
                else
                {
                    clientHandler.Credentials = this.Credential;
                }
                // Instantiate a new HttpRequestMessage, using the specified HTTP method and operation URL,
                // for the request.
    &n
bsp;           var request = new HttpRequestMessage(operationMethod, operationUrl);
                // If header values are provided, add them to the request.
                // NOTE: The implementation is not optimal, but suffices for the sample.
                if (operationHeaders != null)
                {
                    foreach (var key in operationHeaders.Keys)
                    {
                        request.Headers.Add(key, operationHeaders[key]);
                    }
                }
                if (operationContent != null)
                {
                    request.Content = operationContent;
                }
                // Instantiate a new HttpClient, asynchronously invoke the REST resource,
                // and await the result. 
                using (var client = new HttpClient(clientHandler))
                {
                    response = client.SendAsync(request).GetAwaiter().GetResult();
                }
            }
            catch (Exception)
            {
                response = null;
            }
            // Return the resulting HttpResponseMessage.
            return response;
        }
        #endregion
    }
    public enum VariableTypes
    {
        Text,
        TextMultipleLine,
        Number,
        DateTime,
        Boolean
    }
    /// <summary>
    /// Models the information returned for workflow variables by the External Start feature.
    /// This class is used to deserialize that information for the purposes of the sample.
    /// </summary>
    public class WorkflowVariable
    {
        [JsonProperty("Name"), JsonRequired]
        public string Name { get; internal set; }
        [JsonProperty("Type"), JsonRequired]
        public VariableTypes Type { get; internal set; }
        [JsonProperty("Required"), JsonRequired]
        public bool Required { get; internal set; }
    }
 
 And then you'll need a project.json file:
 
 {
  "frameworks": {
    "net46":{
      "dependencies": {
        "Newtonsoft.Json": "8.0.3"
      }
    }
   }
}