I notice that the Log events endpoint involves a PUT rather than a GET. However, the description of Log events is “Get a history of events for a contact in a specific workflow.” This sounds like no data gets changed. Why is a GET not used for this, and does stored data get changed by a Log events PUT?
@seb_fairchild any comments on this? I’d like to make that endpoint into a table, but if data is being changed I won’t be able to. My guess is that the endpoint uses PUT because it passes filter information along as a JSON payload, but I want to make sure nothing is being changed.
Can someone please post an example of how to structure the PUT request? In particular, I’m struggling with passing the filters into the URL.
How are you making the request? I recommend Postman; it makes things much easier. Just paste some of the example filter JSON into the request body, and use the URL structure they specify on the docs page.
Hey @Adam, thanks for your reply! I’ve been using Postman, but with little success. Here’s an example of a URL that I’m using:
Also, I get a “405 Method Not Allowed” response when this is run as a PUT request. It does return JSON when executed as a GET, but event types other than “COMPLETED_WORKFLOW” come back, so the filter doesn’t appear to be applied.
Try putting the filtering stuff in the request body. Like this:
You’ll also need to append /filter to the end of the url. Like https://api.hubapi.com/automation/v3/logevents/workflows/1234567890/filter?hapikey=xxxxx
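To make that concrete, here is a minimal sketch of the request in Python’s standard library. The `hapikey` and workflow id are the placeholders from this thread, and the `"types"` filter shape is an assumed example — check the Log Events docs for the exact filter schema.

```python
import json
import urllib.request

# Placeholders from the thread -- substitute your own key and workflow id.
HAPIKEY = "xxxxx"
WORKFLOW_ID = "1234567890"

# Note the trailing /filter segment before the query string.
url = (f"https://api.hubapi.com/automation/v3/logevents/workflows/"
       f"{WORKFLOW_ID}/filter?hapikey={HAPIKEY}")

# The filter criteria travel in the JSON body, not the URL.  The "types"
# list here is an assumed example, not taken from the docs.
payload = json.dumps({"types": ["COMPLETED_WORKFLOW"]}).encode("utf-8")

request = urllib.request.Request(
    url,
    data=payload,
    method="PUT",
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request) would actually send it.
```

The same structure maps directly onto Postman: set the method to PUT, put the `hapikey` in the query string, and paste the JSON into the raw request body.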
If the docs are followed (i.e. a PUT with a JSON object for filtering and /filter at the end of the URL), I can get filtered results. However, if I make a request like @Mark_Hensley (i.e. without the /filter and using GET instead of PUT), I get all the data back with no filtering applied. I don’t see any mention of a GET in the docs. Is the GET version of the endpoint available for public use? If so, it would be helpful if the docs mentioned it.
Thanks for your help. I am now getting filtered responses.
One more question - each call seems to return a maximum of 50 objects. I assume that the “limit” and “offset” elements can be used to return the full data set (using a series of calls, perhaps?). Any advice on how to accomplish this?
From what I can tell, limitDate and offset specify a date range. Any event that occurred within this range (that also matches your filters) will be returned. limitDate is the start of the range and offset is the end. Date ranges are in UNIX timestamp format, so you’ll probably need to convert the date range you’ve got in your head to UNIX before you set a range.
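A quick sketch of that conversion in Python, assuming the timestamps are milliseconds since the epoch (an assumption based on the 13-digit values used elsewhere in this thread) — the January 2018 range is just a hypothetical example:

```python
from datetime import datetime, timezone

def to_unix_ms(dt):
    """Convert an aware datetime to milliseconds since the Unix epoch.
    (Milliseconds is an assumption, based on the 13-digit timestamps
    used elsewhere in this thread.)"""
    return int(dt.timestamp() * 1000)

# Hypothetical range: all of January 2018, UTC.
limit_date = to_unix_ms(datetime(2018, 1, 1, tzinfo=timezone.utc))  # start of range
offset = to_unix_ms(datetime(2018, 2, 1, tzinfo=timezone.utc))      # end of range
# -> append &limitDate={limit_date}&offset={offset} to the query string
```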
Regardless of the date range I use, I get a maximum of 50 objects returned. Has anyone else experienced this, and if so, how do you use these timestamps to loop through the rest of the events?
You don’t need to loop; just specify a large enough range. Try this: limitDate=1&offset=2517910400000
That’s from 1970AD-81759AD. If you still don’t get more than 50 results back with that, it’s because there are only 50 results in the entire system.
I tried using the limitDate & offset to get all events out of workflows that have well over 50 events, but I was still only getting 50 per call. I reached out to support and got the following reply regarding the limitDate & offset parameters:
Currently, there are two optional parameters that can be used in the “Log Events” endpoint: “Offset” and “Limit Date.” By default, each response will return 50 logs, but you can use the “Offset” parameter to paginate through the logs.
Paginating through the logs is going to be a little tricky without the “has more” indicator that the other APIs seem to have. Anyone have any ideas on how to accomplish this? I really wish the documentation were a little more accurate and helpful in this area.
You’re right, that is really frustrating. I see what you mean now. One way to page through would be to set your offset to some ridiculously late date (like 2517910400000) and start your initial limitDate at 1. The 50 results returned will be sorted by time (by the ‘time’ field, at least it appears that way). Choose the largest of these values to be your next offset. Repeat this process until fewer than 50 results are returned.
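A rough sketch of that paging loop, under this thread’s assumptions (offset is a millisecond timestamp bounding the page, each call returns at most 50 events, and each event carries a ‘time’ field). One caveat: when walking backward from a far-future starting offset, the cursor that actually makes progress is the oldest (smallest) ‘time’ in each page; if your responses are sorted the other way, flip `min` to `max`. The `fetch_page` callable stands in for the actual HTTP request:

```python
def collect_all_events(fetch_page, page_size=50, start_offset=2517910400000):
    """Collect every log event by walking the 'offset' timestamp cursor
    backward from a far-future date, per the workaround in this thread.

    fetch_page(limit_date, offset) -> list of event dicts, each with a
    'time' key, at most page_size per call.
    """
    events = []
    offset = start_offset
    while True:
        page = fetch_page(1, offset)  # limitDate=1: the beginning of time
        events.extend(page)
        if len(page) < page_size:
            break  # a short page means nothing older is left
        # Move the cursor to the oldest event in this page; use max()
        # instead if your responses turn out to be sorted the other way.
        next_offset = min(e["time"] for e in page)
        if next_offset >= offset:
            break  # cursor did not advance; bail out rather than loop forever
        offset = next_offset
    return events
```

Depending on whether the API treats the offset bound as inclusive, the boundary event may appear twice; de-duplicate on its id or ‘time’ if you see that.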