Adobe API and How To Utilize It With Workspace Debugger

Matej Veverka, Analytics, 1 December 2020


Adobe Analytics Workspace is a great tool for many reasons, and at Optimics we use it a lot for both ad hoc analysis and reporting. However, it has its limitations, so sometimes we need custom solutions for reporting purposes. The reasons vary: we may need to blend our web data with information from another data source into one complex dashboard, or our client may simply be used to building their own reports in another tool (Data Studio, Google Sheets, etc.).

In such cases, all we need from Analytics is to provide us with all its hard-collected data. There are a couple of ways to export your data out of Adobe, but for basic reporting we recommend leveraging the updated Analytics reporting API.

Leverage the “Enable Debugger” Option Within Workspace

We all know the advantages of using APIs in terms of speed and automation. The downside is that it usually takes some time to learn their structure, get familiar with all the endpoints, etc. This is where Workspace comes in very handy. Yes, you read that right: the same Workspace we mentioned at the beginning of this article. Don’t worry, it makes more sense than it seems at first sight. Adobe’s approach in all its new Experience Cloud products is to design them “API based”. This means that every change you make within your Workspace project is, under the hood, just an API call that returns the requested data in the desired structure.

Sounds like what we need, right? What if we could easily see these background calls and use them for our custom data pipelines? Save your prayers: we can already do that!

In case you missed this small enhancement: since February 2020 there has been a new “Enable Debugger” option in the Help menu. It lets you easily build JSON requests for Analytics API 2.0, the new version of the reporting API. For more details, please visit the official documentation.

Long story short, you can just turn on the debugger tool and copy the request structure of your report. Then you can paste it into an API client, custom Python code, or wherever you need it. We will show an example with screenshots later in this article, but in general the necessary steps are as follows:

  1. Open Workspace and create a new project (or load an existing one).
  2. Create a freeform table in the desired structure, i.e. in the form in which you would like to pull the data out. Apply all the necessary segments, date ranges or breakdowns.
  3. Click the “Enable Debugger” option in the Workspace Help menu and confirm.
  4. After the automatic reload, you will see a small “bug” icon next to your table, indicating that the debug mode is on.
  5. Click the debug icon and select the latest timestamp from the list (this is the timestamp of the last change/API request for this particular table).
  6. The Oberon XML debugger will open; scroll down to the XML/JSON request section and copy the text of the request.
  7. Now you can use this request text in your own solutions to get the same response as your Workspace freeform table.

Let’s Use the Requests

Awesome! As you can see, you can now easily get Analytics data out with just your Workspace knowledge. Moreover, this is also a great way to learn how the new API works. Experiment a bit, and when you are done, you can turn the debug mode off by clicking the “Disable Debugger” option. After confirmation, the page reloads and the debug icon disappears.

This is a nice enhancement, because in the past, in order to launch the Oberon XML debugger, you had to pass a setting into the browser console and manually refresh the page. Or you could leverage the Chrome extension by Optimics to toggle between the modes. The whole process is now simplified, and there is a native switch directly in the Workspace interface.

Now it is completely up to you how you leverage the data. For inspiration, we have written a simple script that every morning updates a table in a Google Sheet containing the number of leads from PPC campaigns, split by months of the current year, so the performance agency can automate its own calculations and related reporting in Data Studio.

Let us show you yet another use case. Imagine that every week, you as an online analyst send a short report to your manager, describing just the week-on-week comparison of some KPIs, for example the number of sent lead forms. The manager wants just a pretty straightforward message about what is going on: no interest in receiving and studying scheduled PDF reports, no time to log into analytics tools, etc. A regular SMS, a short email summary, or even a Slack message would be a perfect solution. For you, this is quite a routine task, which makes it a great candidate for automation using the API! Personally, we are fond of Python, so we will use this language for our use case. With Python, we can also easily connect to other messaging tools such as Twilio or Slack via their APIs.

First, however, we have to solve the authentication problem. A JSON Web Token (JWT) client is best if you are creating an application that needs to programmatically authenticate calls to the Adobe Analytics APIs. The service account integration can be set up in the Adobe Developer Console (console.adobe.io), where you also enable the Adobe Analytics API:

With this setup and generated credentials and keys, we are ready to continue.

Let us go back to our Workspace project. All we need is to create a freeform table with all the necessary data. We would like to compare the number of sent leads for the last week to the leads for the previous week. A simple table with a Summary Change scorecard will do the job:

As we can see, the number of leads has decreased a bit (by -0.8%) compared to the previous week. Now it is time for the Debugger to shine. We will use it to get the JSON request. Let us start by enabling the Debugger in the Help menu:

Now the new debug icon should be available for our freeform table. After clicking the debug icon, we will select the latest entry for our freeform table:

Then, within the opened Oberon XML window for our freeform table, scroll down to the JSON Request section and copy the request text to the clipboard:

In the meantime, we have prepared our Python script in a Colab notebook. The notebook consists of a few blocks, as shown in the table of contents:

The first one, called “Imports”, just imports the necessary packages and libraries, especially “requests”, “jwt” and “json”.

The authorization part follows; here we use our client ID, client secret and private key from the Adobe Developer Console.
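As a rough sketch, the JWT exchange could look like the snippet below. The claim names follow Adobe’s service-account (JWT) flow; the exact metascope URL depends on your integration, so treat the one used here as an assumption to verify in your own Developer Console.

```python
import datetime

IMS_HOST = "https://ims-na1.adobelogin.com"

def build_jwt_claims(org_id, tech_account_id, client_id, ttl_minutes=10):
    """Claims for Adobe's JWT service-account flow.

    The metascope below is an assumption; copy the one shown
    for your integration in the Developer Console.
    """
    now = datetime.datetime.now(datetime.timezone.utc)
    return {
        "exp": int((now + datetime.timedelta(minutes=ttl_minutes)).timestamp()),
        "iss": org_id,            # IMS organization ID
        "sub": tech_account_id,   # technical account ID
        "aud": f"{IMS_HOST}/c/{client_id}",
        # metascope enabling the Analytics APIs for this integration
        f"{IMS_HOST}/s/ent_analytics_bulk_ingest_sdk": True,
    }

def get_access_token(org_id, tech_account_id, client_id, client_secret, private_key):
    import jwt       # PyJWT, third-party
    import requests  # third-party
    # Sign the claims with the private key generated in the console...
    token = jwt.encode(
        build_jwt_claims(org_id, tech_account_id, client_id),
        private_key,
        algorithm="RS256",
    )
    # ...and exchange the signed JWT for an access token.
    resp = requests.post(f"{IMS_HOST}/ims/exchange/jwt", data={
        "client_id": client_id,
        "client_secret": client_secret,
        "jwt_token": token,
    })
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The returned access token is then sent as a Bearer token with every Analytics API call.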

The key part is the payload definition. Here we leverage the JSON request copied from Workspace, just slightly transformed for Python. It could look like this:

payload = json.dumps({
    "rsid": "yourReportSuiteId",
    "globalFilters": [
        {
            "type": "dateRange",
            "dateRange": "2020-10-18T00:00:00.000/2020-10-25T00:00:00.000"
        }
    ],
    "metricContainer": {
        "metrics": [
            {
                "columnId": "1",
                "id": "cm3808_5f2177d7a0659d14bf37c55d",
                "filters": ["0"]
            },
            {
                "columnId": "2",
                "id": "cm3808_5f2177d7a0659d14bf37c55d"
            }
        ],
        "metricFilters": [
            {
                "id": "0",
                "type": "dateRange",
                "dateRange": "2020-10-11T00:00:00.000/2020-10-18T00:00:00.000"
            }
        ]
    },
    "dimension": "variables/daterangemonth",
    "settings": {
        "countRepeatInstances": True,
        "limit": 400,
        "page": 0,
        "dimensionSort": "asc",
        "nonesBehavior": "return-nones"
    },
    "statistics": {
        "functions": ["col-max", "col-min"]
    }
})

Then all we need is to set the headers and URL properly and make the POST request:

response = requests.request("POST", url, data=payload, headers=headers)
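For illustration, the URL and headers for the 2.0 reports endpoint could be assembled as below; the company ID, API key and access token are placeholder values you obtain from the Developer Console and the token exchange.

```python
access_token = "eyJ..."       # placeholder: token from the JWT exchange
client_id = "yourApiKey"      # placeholder: API key from the Developer Console
company_id = "yourCompanyId"  # placeholder: your global company ID

# Reports endpoint of the Analytics 2.0 API
url = f"https://analytics.adobe.io/api/{company_id}/reports"

headers = {
    "Authorization": f"Bearer {access_token}",
    "x-api-key": client_id,
    "x-proxy-global-company-id": company_id,
    "Content-Type": "application/json",
}
```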

With some simple processing, we can extract from the response the data we are looking for. The output could look like this:

[{'itemId': '1200901', 'value': 'Oct 2020', 'data': [5409.0, 5367.0]}]

From this data, it is easy to calculate the WoW change in %, which equals:

(5367 / 5409 - 1) * 100 = -0.7764836383804763%.

This change will be used in our description generator. First, we store it rounded and as a string in the “diff_text” variable. We can easily parse its sign to determine the direction of the change. Based on the magnitude, simple if-else logic decides whether the change is slight, noticeable or significant. At the end of the processing, in addition to the calculated change, the “magnitude” and “direction” variables are ready to be used in our final message.
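The generator described above could be sketched as follows; the 1% and 5% magnitude thresholds are our own arbitrary choice, not anything prescribed by the API.

```python
def describe_change(current, previous):
    """Turn two weekly totals into a short human-readable summary.

    The magnitude thresholds (1% and 5%) are arbitrary assumptions.
    """
    change = (current / previous - 1) * 100
    diff_text = f"{change:.1f}%"
    # Sign of the change gives the direction...
    direction = "decreased" if change < 0 else "increased"
    # ...and its absolute size gives the magnitude.
    if abs(change) < 1:
        magnitude = "slightly"
    elif abs(change) < 5:
        magnitude = "noticeably"
    else:
        magnitude = "significantly"
    return (f"Last week, the number of leads has {direction} {magnitude} "
            f"by {diff_text} compared to the previous week.")
```

For the sample row above, `describe_change(5367.0, 5409.0)` yields the summary sentence with “decreased slightly by -0.8%”.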

The final message has a pretty simple pattern, varying only in the “direction”, “magnitude” and “diff_text” values. The result provides the manager with the information that “Last week, the number of leads has decreased slightly by -0.8% compared to the previous week”.

That’s it, easy as that. You can further upgrade this simple code and also describe in the final message the most important contributors to the weekly change, etc. We also recommend downloading the script and deploying it into a production pipeline that takes care of scheduling. The automation task is done, and it is up to you and your manager’s or supervisor’s preferences how you deliver the final message. The integration options with Python are almost endless, as are the possibilities for doing even fancier stuff with all the available libraries, e.g. for NLP. The sky is the limit!
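For example, delivering the summary via a Slack incoming webhook is a one-liner; the webhook URL below is a placeholder you create in your Slack workspace.

```python
def build_slack_payload(message):
    # Minimal payload accepted by a Slack incoming webhook
    return {"text": message}

def send_to_slack(webhook_url, message):
    import requests  # third-party
    resp = requests.post(webhook_url, json=build_slack_payload(message))
    resp.raise_for_status()

# send_to_slack("https://hooks.slack.com/services/...", final_message)
```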

If you are interested in our solutions or you would like to discuss your specific use cases, do not hesitate to contact us.

Have a happy API time!
