Azure API Management – User Migration Tooling – Internal Users

In Azure API Management, users are managed in the Publisher Portal, which will one day be deprecated. Until then we have the Azure Portal, which is slowly being extended to let us manage groups, users and permissions. One element that is not as easy as it may seem is the migration of said users from one instance to another. Some API programs have up to six instances of API Management; migrating test users and onboarded users to the various boxes is tedious, and doing so manually is not a consideration.

Here are some things that are to be considered for our scenario:

a) Users are stored in an internal database and not in AAD. 

b) We will not use ARM templates to move the users over

c) We wish to use the underlying API from APIM to access users/groups and more in order to migrate new users and update certain elements as part of the CI/CD pipeline.

Let’s take a look at a solution.
  1. Get all users in the current zone
    1. API call to: {SubscriptionIDSource}/resourceGroups/{ResourceGroupNameSource}/providers/Microsoft.ApiManagement/service/{APIMInstanceNameSource}/users?api-version=2017-03-01
  2. Get all users in the target zone
    1. API call to: {SubscriptionIDTarget}/resourceGroups/{ResourceGroupNameTarget}/providers/Microsoft.ApiManagement/service/{APIMInstanceNameTarget}/users?api-version=2017-03-01
  3. Compare and determine if the user is to be moved
    1. Use the lists returned with Intersect and Except to find out what is in both lists and what is missing from the target
  4. Assign user to a group
    1. Get the user's groups with {SubscriptionID}/resourceGroups/{ResourceGroupName}/providers/Microsoft.ApiManagement/service/{APIMInstanceName}/users/{userID}/groups?api-version=2017-03-01
  5. Assign user to a subscription
    1. Use {SubscriptionID}/resourceGroups/{ResourceGroupName}/providers/Microsoft.ApiManagement/service/{APIMInstanceName}/users/{userID}/subscriptions?api-version=2017-03-01
  6. Report on the interaction.
    1. Prepare a report.
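The comparison in step 3 can be sketched as follows. This is a minimal sketch only: the real tooling is .NET and uses LINQ's Intersect and Except, and the `diffUsers` helper and sample objects here are my own illustration of the same set logic applied to the lists the /users calls return.

```javascript
// Hypothetical sketch of step 3: given the user lists returned by the two
// /users calls, decide which users must be migrated to the target instance.
function diffUsers(sourceUsers, targetUsers) {
  // index the target users by their resource name for fast lookup
  const targetIds = new Set(targetUsers.map(u => u.name));
  return {
    // users present in both instances (what LINQ Intersect would return)
    inBoth: sourceUsers.filter(u => targetIds.has(u.name)),
    // users missing from the target, i.e. to migrate (what LINQ Except would return)
    toMigrate: sourceUsers.filter(u => !targetIds.has(u.name)),
  };
}

const source = [{ name: "alice" }, { name: "bob" }];
const target = [{ name: "alice" }];
console.log(diffUsers(source, target).toMigrate); // [ { name: "bob" } ]
```

Each user in `toMigrate` would then be created in the target instance and assigned its groups and subscriptions per steps 4 and 5.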

This technique has been demonstrated at many talks that I do, and I have a full project for you if you wish. Let me know and I can send you the source if you would like to use it.

For the CI/CD Pipeline I use the release constructs to fire a unit test which contains the migration tooling. Great for test and dev zones.

Happy coding!



Azure API Management – x509 Policies and Security Constraints

When working with x509 certificates in Azure API Management, it is possible to accept an x509 certificate on the initial call to identify the client.

This means the POST to Azure API Management includes the x509 certificate, and the policies should validate that the certificate is present.

Where things go astray is when we also have an x509 certificate securing the back-end channel. Now we have the possibility of two certificates.

One to identify the client. One to secure the back end channel.

Great! No issues so far: we can use a check to validate the certificate as it comes in, and we can attach an x509 certificate to secure the back end with a one-liner in an APIM policy.

Here is where issues arise!

What are the issues which can present themselves in this scenario?

Unable to update API definition manually

a) When securing the back-end channel from APIM, try to update your API definition from the GUI (Portal) and let me know if you can attach the x509 certificate so that the API does not complain about a missing certificate before it renders the Swagger definition for APIM to consume….

Move two certificates to the API

b) What if your first x509 certificate is used to identify a particular client and match a database entry in the API? Now we have to send down two x509 certificates.

Here are fixes to both these issues in APIM:

Unable to update API definition manually

For A, where we have issues updating manually because x509 certificates cannot be attached in the Portal. (For that matter, it is also not possible to do so in the Developer Portal when you use the Try It! feature, so your clients are stuck using unit tests and cannot use the tooling manually.)

Move two certificates to the API

<!-- relay client cert -->
<choose>
  <when condition="@(context.Request.Certificate != null)">
    <set-header name="X-APIM-ClientCert" exists-action="override">
      <!-- forward the raw client cert, base64-encoded -->
      <value>@(Convert.ToBase64String(context.Request.Certificate.RawData))</value>
    </set-header>
  </when>
</choose>

<!-- send x509 certificate to secure the back end -->
<authentication-certificate thumbprint="your thumbprint here" />

The relay will take the client cert (x509) the client sent and move it into the X-APIM-ClientCert (custom) header; the authentication-certificate policy will relay your cert via the X-ARR header. You have two headers going downstream… ensure you enforce HTTPS.

x-arr is for the APIM-to-API/web app mutual TLS authentication.
x-apim-clientcert (or whatever you choose to call it) relays the client cert downstream.

Happy Coding!

Azure API Management – Fingerprinting for Reconnaissance and Leaky Headers

The first part of any penetration test or malicious activity is usually reconnaissance. OSINT Tools gather user related information. Http Fingerprinting and tooling like Maltego/Sploitego gather server based elements. At times no tooling is required to identify what management stack the API Platform is utilizing. 

One thing is clear: the attack surface needs to be determined. Most professionals want to fingerprint what server you are running, what IDS/IPS stack is protecting the artifact, and, in our case, what the management base for the API is.

Here is a typical response from Azure API Management.

Pragma: no-cache
Transfer-Encoding: chunked
Ocp-Apim-Subscription-Key: ******removed by me****
Cache-Control: no-cache
Date: Thu, 01 JAN 2017 12:59:38 GMT
X-AspNet-Version: 4.0.30319
X-Powered-By: Azure API Management -,ASP.NET
Content-Type: application/xml
Expires: -1

The elements of concern are the Ocp-Apim-Subscription-Key, X-AspNet-Version, and X-Powered-By headers.


Why does the consumer need to know the underlying API's URL? They can now use tooling to determine the server type and also bypass our management appliance and all the security constructs we may have added to the policies.


How about the IP of the URL we just gave away, making it even easier to bypass the security and management implemented in the management appliance.

Ocp-Apim-Subscription-Key: ******removed by me****

Here we are sending the subscription key back to the user. Not sure why; the caller was the one who successfully provided it. Let's not add more payloads with sensitive data. It needs to be removed.

X-AspNet-Version: 4.0.30319
X-Powered-By: Azure API Management -,ASP.NET
Content-Type: application/xml

Lastly, we have the server and appliance information conveyed in a clear manner to the caller. This information should not be shared in such an easy manner.

A simple alteration to a policy in APIM can omit this data from being sent back to the caller. We can also add security elements at the same time since we are manipulating headers.

Two birds, one stone.

  1. Remove elements that can be used to bypass and discover
  2. Augment headers for security reasons.
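The two goals above can be sketched as an outbound policy along these lines. This is an illustrative sketch rather than the exact policy from my instance; the set-header delete/override actions used are standard APIM policy elements:

```xml
<outbound>
    <!-- 1. remove headers that leak implementation details -->
    <set-header name="X-Powered-By" exists-action="delete" />
    <set-header name="X-AspNet-Version" exists-action="delete" />
    <set-header name="Ocp-Apim-Subscription-Key" exists-action="delete" />
    <!-- 2. augment the response with security headers -->
    <set-header name="X-XSS-Protection" exists-action="override">
        <value>1; mode=block</value>
    </set-header>
    <set-header name="X-Frame-Options" exists-action="override">
        <value>deny</value>
    </set-header>
    <set-header name="X-Content-Type-Options" exists-action="override">
        <value>nosniff</value>
    </set-header>
</outbound>
```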

Let's take a look now at the response:

Pragma: no-cache
Transfer-Encoding: chunked
X-XSS-Protection: 1; mode=block
X-Frame-Options: deny
X-Content-Type-Options: nosniff
Cache-Control: no-cache
Date: Thu, 01 Jan 2017 13:26:58 GMT
Content-Type: application/xml
Expires: -1

Clean and not leaking data. 

TIP: use the policies in APIM to manipulate responses and avoid leaking data about your underlying layer, especially if you are using PaaS implementations where you do not want people to bypass the APIM appliance. Don't forget to secure your channel with x509 certificates as well.


Azure API Management x509 Certificates Demystified

x509 certificates are heaven sent. They give us the capability to do mutual authentication; we can secure back-end channels and validate clients, and in the end they are just a great security construct.

When building a professional API program one must go over the capabilities of the API Management appliance (SaaS) to secure the payloads. Having done some work with Azure API Management and x509 certificates, I know the shortcomings and the key features, as well as techniques to relay the initial certificate to the back-end channel even if that back-end channel is also using an x509 certificate.

First and foremost, we will go over the different use cases.

  1. Identifying the client in APIM
  2. APIM securing a back-end channel via an x509 certificate
  3. Both 1 and 2, where the API receives both x509 certificates: one to secure the back-end channel, and one to identify a client and possibly make decisions based on it
  4. A developer trying to attach an x509 certificate in the developer portal in order to test an API
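For use case 1, a minimal inbound policy sketch that rejects calls lacking a valid client certificate. This is an illustrative fragment (the status code and validation depth are choices you would tune), using the standard context.Request.Certificate check:

```xml
<inbound>
    <choose>
        <!-- reject the call when no client certificate was presented or it fails validation -->
        <when condition="@(context.Request.Certificate == null || !context.Request.Certificate.Verify())">
            <return-response>
                <set-status code="403" reason="Invalid client certificate" />
            </return-response>
        </when>
    </choose>
</inbound>
```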

Web Application Fortification with ModSecurity over IIS : OWASP ZAP Zed Attack Proxy


Who will prevail? Two of my favourite tools at hand: one is for offence and one is for defence, however we can argue that both are for defence if you analyse it from another point of view.

With what looks like one of the coolest logos:

The ZAP Attack Proxy is a free tool from OWASP that can act as a proxy intercepting traffic for analysis and also performs scans. Not to mention it can integrate with a large number of other tools.

From their site:

The OWASP Zed Attack Proxy (ZAP) is one of the world's most popular free security tools and is actively maintained by hundreds of international volunteers. It can help you automatically find security vulnerabilities in your web applications while you are developing and testing your applications. It's also a great tool for experienced pentesters to use for manual security testing.

Let’s get into the action. I have two sites on IIS , one is being secured by ModSecurity and the other isn’t. Let’s see the variances.

Port 80 is secured, and port 2016 has just seen a run.

When firing the execution against the port 80 rendition we get a 403, which is what we want. Not long ago a consultant came in and advised us to use this tool, which I also advise! However, I asked: what do you do if we have an IDS that knows this signature and stops the traffic instantly? She had not seen such a thing before, usually seeing the tooling continue on with the scan. Now, using this tool as a proxy to intercept is a different ball game.

Takeaway: how many different scanners have you tested against your site? Do you stop them instantly? Do you have forensics telling you someone at a given IP address is constantly scanning?

Food for thought!

Happy Defending!

Web Application Fortification with ModSecurity over IIS : HTTrack Website Copier

Excessive recursion is the number one problem plaguing modern web applications and APIs. I always use the analogy of the bank where the client keeps going to the teller and trying credentials in order to have his card authenticated. One of the elements I like to utilise is the module that Mads did in 2007!

What, 2007? Yes, 2007! It works wonders for customers who refuse to add ModSecurity or the SNORT IDS or to have any appliances. Whether on classic WebForms / MVC / APIs, we integrate this module and I can tweak it to allow only enough traffic to mimic a human user.

ModSecurity over IIS is excellent when dealing with excessive recursion. I have seen it stop the OWASP ZAP Zed Attack Proxy in its tracks, stop Brutus from cycling its usual credential attacks, SQLMap from trying to pull databases from vulnerable SQLi sites. One element where it allowed the traffic to go through was with the HTTrack Website Copier.

What is the HTTrack Website Copier? From their site:

HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.

It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site’s relative link-structure. Simply open a page of the “mirrored” website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.

I first utilised this when our product manager was going to a remote part of Africa where the internet was going to be scarce. I downloaded the utility and presto, I had a nice offline rendition of the site.

Then it dawned on me: isn't this a nice reconnaissance tool for sites that perhaps do not have a lot of forensics? It could allow us to download a lot of elements and then do some analysis offline.

Evidently not everything can be pulled, and observing something live is better, but we don't register hits on things that are local.

Notice that ModSecurity and IIS allowed me to fire excessive recursions against the site. I had to add custom rules in order to halt the attack.
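As a hypothetical example of such a custom rule, here is one that blocks HTTrack's default User-Agent outright. The rule id is arbitrary and the User-Agent is trivially spoofed, so treat this as a first layer only, not a substitute for rate-based rules:

```apacheconf
# Custom rule: deny requests whose User-Agent identifies the HTTrack copier
SecRule REQUEST_HEADERS:User-Agent "@contains HTTrack" \
    "id:100001,phase:1,t:none,log,deny,status:403,msg:'HTTrack website copier blocked'"
```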

Most likely no one is testing against it!

Happy defending!


Swashbuckle and the Swagger UI : Curious Case of the Invalid Data Types

Swashbuckle is a nice NuGet package that allows for easy integration with Swagger and qualifies as one of the de facto REST API discovery tools, alongside ASP.Net Web API's HELP page, which can showcase a service quite nicely.

One of the quirks I have come to observe in the Swashbuckle rendition is that when you generate a sample from the REST API in what is called the Swagger UI, at times you have a mismatch on the data types. Now Swagger can generate a JSON document with the /docs/v1 payload, and this is not affected, as you will see in the example. However, to bypass the data type issue I had to add my own rendition of index.html for the Swagger UI and augment some JavaScript.

Here is the issue at hand, with a tutorial on how to replicate it and a nice discovery on how different data types are rendered.

First and foremost, we set the stage.

From Visual Studio create a new Web Application.

Select Web Api and follow the screenshot for other indicators.

The following is generated.


Add a class with various data types. Types that are not common across all platforms should be utilised; nullables of double and decimal come to mind.

Add the NuGet Package for Swashbuckle

Oversee the changes

Notice the addition of this file. We can add a MapType<> call in here to tell Swashbuckle what to do with a double and how to convert it, but it has no effect on the Swagger UI part; only on the JSON from /docs/v1, as an example.

Fire up the service and oversee what you get with the default tooling.

Select the API which returns your class

Notice in the JSON that the doubles have a (.) dot, and so do the decimal types.

  "datetimeVal": "2017-05-10T15:08:06.1547306-04:00",
  "stringVal": "sample string 2",
  "integerVal": 3,
  "nullIntVal": 1,
  "doubleVal": 4.1,
  "nullDoubleVal": 1.1,
  "decimalVal": 5.0,
  "nullDecimalVal": 1.0

Now fire up swagger with /swagger



Observe what is generated; notice no (.) dot.
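A likely explanation for the missing dot: the Swagger UI builds its sample in JavaScript, which has only one Number type, so a whole-number double or decimal is indistinguishable from an integer once stringified. A quick sketch using the sample values from above:

```javascript
// JavaScript has a single Number type: 5.0 and 5 are the same value,
// so whole-number doubles/decimals lose their dot when stringified.
const sample = { doubleVal: 4.1, decimalVal: 5.0, nullDecimalVal: 1.0 };
console.log(JSON.stringify(sample));
// {"doubleVal":4.1,"decimalVal":5,"nullDecimalVal":1}
```

This is presumably why patching the rendered text in a custom index.html is needed, rather than patching the parsed object.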

Copy and Paste that into a classic client generator.

Also try a Java implementation; same issue.

The /docs/v1 output is OK, however!

The data types are format double and type number which is what we want. 


This said, be careful when generating client classes, as this will have casting repercussions.

Happy Coding!




ASP.Net Web API and ModSecurity over IIS

ModSecurity is a great tool and a great complement to IIS. The best thing is that it can secure all sites or some sites; regardless of what you want to secure, as long as you can run the HTTPModule you can secure the inbound and outbound payloads.

From their site:

What Can ModSecurity Do?

ModSecurity is a toolkit for real-time web application monitoring, logging, and access control. I like to think about it as an enabler: there are no hard rules telling you what to do; instead, it is up to you to choose your own path through the available features. That’s why the title of this section asks what ModSecurity can do, not what it does.

In order to install ModSecurity, head over to their site to get the latest installer:

ModSecurity: Open Source Web Application Firewall
Here are the install steps and the discovery and startup of your first site on premise and in the cloud.
First and foremost – use the double click! It’s what us Devs do best!
Away we go…

There are 64- and 32-bit renditions and a repository for the OWASP CRS, which stands for Core Rule Set – which you want.

Next is the ability to configure the instance. You will want to say yes unless you are doing more of a silent install or want to PowerShell these permissions/additions… otherwise select the box and move along.

We are now complete; finish and go explore.

This said, the first thing to look over is IIS itself.

Notice the addition of 2 new HTTPModules:

Excellent, now off to the root, which should reside at:

 C:\Program Files\ModSecurity IIS

Peruse the files and concentrate on .conf.

Then for the site you want enabled use this in your web.config:

<ModSecurity enabled="true"
configFile="C:\Program Files\ModSecurity IIS\modsecurity_iis.conf" />

<!--<remove name="ModSecurity IIS" />-->

<add name="ModSecurity IIS (64bits)" preCondition="bitness64" />



Away you go… in my next post I will be attacking a localhost site with various tools to see how ModSecurity and IIS react.

Happy Defence!

Microsoft Dynamics for DotNet Developers

With the somewhat recent announcement that Dynamics is going to be the CRM of choice at the GOC, we are announcing a presentation on Microsoft Dynamics for .Net Developers. When we discussed doing a series to start up a study group, the masses wanted BA, functional and testing focused areas; however, our user group being more technical in nature, we will be concentrating on the .Net side of things, with a lot of examples coming from ASP.Net Web API or other elements.

Here is the event:

Microsoft Dynamics Certification Study Group Planning

Tuesday, Feb 14, 2017, 12:00 PM

Microsoft Canada Co. (Ottawa)
100 Queen Street, Suite 500 Ottawa, ON

26 IT Community Members Went

Planning and orchestration of a new study group for a series of MS Dynamics certification exams. We are pleased to announce that we will work as a group to create a new study group for Microsoft Dynamics exams. Previously we have had success with MCAD and MCSD study groups and we wish to continue with this new series. Planning: • Which exams • Exam…



Azure Lunch and Learn: Azure Api Management Showcase

I have great news to share with the community. I was able to secure a room for 12 engagements in order to go forth with an Azure monthly series.

As an ASP.Net and ASP.Net Web API specialist I will be doing the demos around these constructs.

The first is on Azure API Management, where we will also see renditions of MuleSoft and Apigee for API management. For the first lunch and learn we will concentrate on modeling ASP.Net Web APIs and creating ASP.Net Web APIs. Once this base is completed we will continue and aggregate the API with API Management. I would say the session will be 80% Web API and 20% Azure API Management.

See you there:

Azure Lunch and Learn: Azure Api Management Showcase

Tuesday, Feb 28, 2017, 12:00 PM

Microsoft Canada Co. (Ottawa)
100 Queen Street, Suite 500 Ottawa, ON

41 IT Community Members Went

How: This first segment in the Azure Lunch and Learn Series will focus on API Management in a nutshell. The goal is to go over a product and its intrinsics, but just enough over a lunch hour for you to take away key concepts and start guiding your research. We are going to offer this series once a month for you to come in and learn Azure…
