Infusionsoft Saved Searches Issue

Hey guys,

I’m pulling multiple reports from Saved Searches / Saved Reports in Infusionsoft via the API into third-party tools. The connection uses a developer account (client_id, client_secret) with OAuth authentication.

I’m facing a couple of errors in various situations:

a) HTTP server returned unexpected status: Unauthorized

Even though my user has full access to these reports and I can view them inside Infusionsoft, this error keeps appearing. The reports only have around 3–4K to 10K rows.

b) Too many consecutive failures. This saved search is blocked temporarily.

After several occurrences of error a), this error followed.

c) HTTP server returned unexpected status: GATEWAY_TIMEOUT

This one appears on small, medium, and large pulls alike.

So, I want to understand the following:

  1. Why are these errors showing up, and how can I avoid them in the future?
  2. Is there a limit on the number of rows we can pull via saved searches / saved reports?
  3. What are some things we must NOT do when pulling saved reports?

Kindly let me know your thoughts.

I’ve experienced this myself, and after speaking with some IS developers I found that for a long time the TTL was too short. It doesn’t matter how many rows are returned but rather how complex the query is that generates those rows. The more complex the query, the longer it takes, and it will fail if it runs past the current TTL. To test this, simplify the search and see if your error messages clear up. I’ve had complex queries that I had to simplify and then do the remainder of the filtering on the calling side of things.
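For anyone wondering what “filtering on the calling side” could look like in practice, here’s a rough Python sketch. It assumes the legacy XML-RPC SearchService reached with an OAuth access token; the saved search ID, user ID, and field name are placeholders, not values from this thread.

```python
# Rough sketch: pull a *simplified* saved search, then filter locally.
import xmlrpc.client

ACCESS_TOKEN = "your-oauth-access-token"   # placeholder
SAVED_SEARCH_ID = 123                      # placeholder: ID of the simplified saved search
USER_ID = 1                                # placeholder: user the saved search belongs to

client = xmlrpc.client.ServerProxy(
    "https://api.infusionsoft.com/crm/xmlrpc/v1?access_token=" + ACCESS_TOKEN
)

rows, page = [], 0
while True:
    # Each call returns one page of records; keep paging until a page comes back empty.
    batch = client.SearchService.getSavedSearchResultsAllFields(
        ACCESS_TOKEN, SAVED_SEARCH_ID, USER_ID, page
    )
    if not batch:
        break
    rows.extend(batch)
    page += 1

# Do the extra filtering here instead of inside the saved search itself,
# e.g. keep only contacts in a particular state (field name is a placeholder).
filtered = [r for r in rows if r.get("State") == "CA"]
print(f"Pulled {len(rows)} rows, kept {len(filtered)} after local filtering")
```

The idea is that the saved search stays simple enough to finish inside the TTL, and the extra conditions move into your own code.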


@John_Borelli

Thanks, John!
Any idea how to gauge the complexity of these saved searches, or do you also mean the number of columns/rows, etc.?
Are there any other workarounds that did the job besides the one above?

If you could give a quick, small example of how you converted a complex search into a simpler one that worked, it would be very helpful! :slight_smile:

Have a great week ahead!

Hi @Praxis_Team,

So the number of rows returned isn’t relevant, but rather HOW those rows are collected. If I have 5–6 conditions that generate that information, then that is a more complex query. If I am just looking for everyone that has a tag applied and nothing more, then that is a much simpler query and therefore quicker to run. What I would suggest is to start your query with a single condition (to test things out). If it works, add one more condition, and so on, until you find the ‘threshold’ for your particular search parameters and the time the server allows the query to run before timing out.
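Along the same lines, here’s a hedged sketch of how one might space out retries when a page fails (for instance with a GATEWAY_TIMEOUT), so the saved search doesn’t rack up consecutive failures and get blocked temporarily. The method name and token handling mirror the sketch above and are assumptions, not confirmed specifics.

```python
# Rough sketch: fetch one page with exponential backoff on transient errors,
# so repeated quick failures don't trigger the "blocked temporarily" state.
import time
import xmlrpc.client

def pull_page(client, token, saved_search_id, user_id, page, max_attempts=3):
    """Fetch one page of a saved search, backing off on transient failures."""
    for attempt in range(max_attempts):
        try:
            return client.SearchService.getSavedSearchResultsAllFields(
                token, saved_search_id, user_id, page
            )
        except (xmlrpc.client.ProtocolError, xmlrpc.client.Fault):
            # e.g. a gateway timeout or a temporary server-side failure
            if attempt == max_attempts - 1:
                raise
            time.sleep(2 ** attempt * 5)   # wait 5s, 10s, ... before retrying
```

Combined with the one-condition-at-a-time testing above, this at least keeps a slow query from turning into a blocked saved search.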