A year of marketing at Streamdata.io

It will soon be a year since I started working at Streamdata.io, and I want to take a moment to look back at what I’ve done for the company over those 12 months.

There are so many things I wish I could do in a day that I tend to focus on what is still to be done instead of on what has been achieved. This kind of behavior is probably not specific to me. On staircases, I’m always obsessed by the steps to come, not by those already climbed.

As far as marketing is concerned, Streamdata.io was almost starting from scratch when I joined. A few marketing actions had been taken, but outside of any global strategy. I don’t blame anyone: at some point you need something and you do it. Because you can. Because it’s easy to do. But I think it’s fair to say that things were a little messy.

What is Streamdata.io?

I started with the basics: understanding the problem we solve. If you’re not familiar with APIs, let’s say that whatever you do as a human on a website can (or could) be done programmatically through an API. You send a request to an API, and the API sends you a response. You’re then considered a client (of the API).

In some situations, it’s important for you to stay in sync with the data available through the API. Let’s say you follow stock market prices through an API. By the time the API sends you the price after receiving your request, that price has probably already changed, so you need to send another request to the API, and so on, and so on.
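The request/response loop described above boils down to polling: asking for the same resource again and again, whether or not it changed. Here is a minimal sketch of that pattern; `fetch_price` and its payload are made up for illustration (a real client would issue an HTTP request to the provider’s endpoint):

```python
import time

# Hypothetical stand-in for a stock price API; in reality this would be
# an HTTP round trip, e.g. requests.get("https://api.example.com/price").
def fetch_price():
    return {"symbol": "ACME", "price": 123.45}

# Naive polling loop: every iteration is a full request/response cycle,
# even when the answer is identical to the one we already have.
def poll(iterations=3, interval=0.0):
    responses = []
    for _ in range(iterations):
        responses.append(fetch_price())
        time.sleep(interval)  # wait, then ask again
    return responses

snapshots = poll()
```

Every snapshot here comes back identical, which is exactly the waste the rest of this section is about.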

This situation has two major drawbacks:

  • for people interested in the data: despite their compulsive requests, they never know whether the data they have is fully up to date.
  • for people providing the data and operating the API, the challenge is different: the more successful the API and the more requested the data, the harder it gets to scale and handle the traffic.

To get an idea of the scale of the issue, studies show that 98.5% of API calls retrieve data the client already has. Clients waste a lot of energy trying to stay in sync, and API providers have to maintain an infrastructure able to handle all this useless traffic, or enforce API call limits (which makes the sync issue even worse for the clients).

And that’s where Streamdata.io comes in. Streamdata.io provides a proxy, meaning that our service sits between the client and the API. We provide two main features:

  • a push technology: what we call streaming. Basically, instead of having to request the data over and over again, you “subscribe” to it once and the data is then pushed to you (“streamed”). You no longer request it. This is the view from the client side: no effort to get the data, it comes to you.
    For the API provider, the value is that all the clients getting the data through Streamdata.io no longer poll the API individually. Our service is the only one directly requesting the data, which it then pushes to all subscribers/clients. The load is reduced, making scaling easier.
  • differential computing: the data provided by an API usually has several fields that are all sent over every time through the request/response mechanism. For example, if you are interested in the Bitcoin price, the API might send you several fields, such as the current price, but also the highest and lowest prices for the week and for the month: five fields in total. While the price might change often, the other fields change far less frequently. Streamdata.io computes the difference between one API call and the next, and only sends clients what has changed instead of the whole data set. In the previous example, you’d receive only the fields that actually changed instead of all five every time. Depending on how the API is built and the kind of data requested, Streamdata.io can save up to 90% of bandwidth.
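The differential computing idea can be sketched in a few lines: compare two consecutive API responses and keep only the fields whose values changed. The field names below are made up for the example, and this is a simplified sketch of the concept, not the actual wire format the service uses:

```python
# Toy illustration of differential computing: given the previous and the
# current API response, keep only the fields whose values changed.
def diff(previous, current):
    return {k: v for k, v in current.items() if previous.get(k) != v}

# Two consecutive snapshots of a hypothetical Bitcoin price API:
first = {"price": 6400, "week_high": 6900, "week_low": 6100,
         "month_high": 7200, "month_low": 5800}
second = {"price": 6410, "week_high": 6900, "week_low": 6100,
          "month_high": 7200, "month_low": 5800}

patch = diff(first, second)  # only the price moved, so only it is sent
```

Instead of re-sending all five fields, the client receives a one-field patch.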

With those two features, we provide a solution where clients receive small patches of data in real-time (i.e. as soon as new data is available on the API). The service can be used by API providers to stream data to their clients, or can be used by clients to get access to a streaming version of any API.
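From the client’s point of view, staying in sync then amounts to applying each incoming patch to a local copy of the data. This sketch assumes the simple changed-fields patches from the previous illustration, not any particular patch standard:

```python
# Sketch of the client side: start from an initial snapshot, then apply
# each incoming patch (a dict of changed fields) to keep a local copy in
# sync without ever re-fetching the full data set.
def apply_patches(snapshot, patches):
    state = dict(snapshot)  # don't mutate the caller's snapshot
    for patch in patches:
        state.update(patch)  # overwrite only the changed fields
    return state

initial = {"price": 6400, "week_high": 6900}
updates = [{"price": 6410}, {"price": 6425, "week_high": 6950}]
final = apply_patches(initial, updates)
```

After applying both patches, the local state reflects the latest values without a single extra poll.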


From the previous paragraph, you can deduce two of our targets: API providers and data consumers (clients). Let’s be a little more specific.

Not all API providers are targets. APIs can be used for a lot of things, but the only providers that are a fit for our solution are data providers.

On the other side, there are two kinds of data consumers we discovered we could target.

  • The most obvious ones: companies building apps for humans that rely on API data. Say you operate an investment app and want to display stock prices to your own customers in real time. These prices are provided to you through an API by a data provider. Our solution allows you to display those prices in real time, enhancing the user experience compared to having to reload the page.
  • Data scientists: machine-learning software and predictive models need to be fed with data. With differential computing, data scientists can consume API data as events instead of raw data, saving them precious time deduplicating it.

Streamdata.io had mainly been targeting developers as a whole, with a lot of technical content. My assumption was that targeting technical profiles wasn’t enough. We needed to reach out to business-oriented profiles like product or revenue teams.

In the end, we settled on five targets:

  • API Provider – Technical Profile
  • API Provider – Business Profile
  • Data Consumer – Technical Profile
  • Data Consumer – Business Profile
  • Data Scientist


How you present yourself is important, even more so when you don’t sell something people can see or hold physically. How you present yourself is how people will initially evaluate how much trust they can put in you. And without being obsessed with perfection, as we wanted to start addressing business profiles, something had to be done in order to be seen as serious players.

I built several decks that were improved over time. I wanted something light and clean to introduce the product without giving the feeling it is complex, which is what you get with dark themes.

It was also important for me to follow some sort of logic from one slide to the other, starting by the context, the problem we identified, the solution we built to solve it and its different applications and benefits.

This is what we had and what we ended up with:

I knew we would get deeper into branding at some point, typically as we’d be building a new website. Building this presentation was at first some sort of quick-win, but it seems like we got used to it over time.


When I joined, Streamdata.io had three pricing options:

  • a free plan
  • a self-service consumption based tier available on the SaaS
  • a last tier for which you’d need to get in touch with us, for huge SaaS clients or clients interested in a managed or on-premises version of our service.

My immediate feeling was that the pricing needed to change. The free plan was fine by me, even better than a time-limited free trial. But what our pricing was basically telling customers was “Don’t try to talk to us unless you spend at least $1k/month on our SaaS”, because only then would you get into custom plans.

The second issue was that our customers would only discover their bill at the end of the month. If you remember the time when phone calls were charged this way, you probably remember your mum always asking you to hang up because you were going to ruin the family (I suddenly remember there used to be free local calls in the US in the 80s, I think; it wasn’t the case in France).

We decided to go for flat rates, where you pay at the beginning of the month and have fixed limits. We kept the monthly billing rhythm and added a discounted yearly option to our plans. We also decided to split the middle consumption-based tier into two flat-rate tiers: the first at $100/month and the second at $500/month. This also means we created an opportunity to start discussing custom plans sooner with our biggest customers.

After noticing that a lot of clients were spending time with our support team for integration, and really appreciated the quality of their input and feedback, we decided to create a premium support plan, included in our premium plan and available as a paid upgrade on the two newly created tiers.

Websites and CRM

Streamdata.io has a website and a developer portal for the SaaS. When I joined, there were also Salesforce and Pardot, and we had Google email accounts.

On the website, you could sign up for a newsletter that wasn’t managed through Pardot (and wasn’t even sent once a quarter), or contact us through a form that would end up as an email in a shared inbox. Our sales email address was also published on some pages of the website.

On the SaaS, we had our users, with data about them. A lot of data. Very interesting data. Shared every day at midnight by email as CSV files (one of them was more than 500 MB and over a million lines).

In Salesforce, we had people the team had met at some point at conferences, events, or meetings. They kept business cards, and my boss’s son entered the data into Salesforce without any further context: just name, email, company, job.

Information from calls or meetings with our clients or prospects, such as project follow-ups, was mainly kept and shared as text documents in Google Drive.

None of these elements were connected together. No unified source of information.

While our technical team took care of exporting our portal data directly to Salesforce, I worked on connecting our website to Salesforce and to Pardot, took some time to sync our portal data to Pardot through Salesforce, took some time to clean our database, and took some time to build the newsletter from Pardot and get rid of the old tool.

You guessed it: it took me a lot of time.

I wouldn’t say it’s all been fun, because there’s a lo(ooo)ng learning curve and I was starting from scratch. I knew what could be done with a CRM like Salesforce or with marketing automation tools like Pardot, but I’d never before had the opportunity (is it really one? :P) to do it all by myself. So it takes long, and you make mistakes, but to be honest, once you’ve got the basics down and feel comfortable with both tools, it’s really great to see what you can get out of them.

I finally had a unified vision of our prospects and users, so I could start identifying patterns and taking data-driven actions to push them in the right direction.

But let me step back: I said I connected Salesforce and Pardot to our website, but something happened before that: I also took the time 🙂 to write the specification for a new website. Too many different features built over time by different teams had made the old website a mess to manage. It was time to build a new one, with our new positioning, our new sales pitch, and all this CRM integration magic. I had several agencies pitch, picked one, and have been their main contact. It took us almost 6 months from briefing to a live website.

I’m not sure that anyone today, apart from the developers at our agency, understands what we do. And I obviously haven’t been good enough at explaining it (though when you don’t work in your native language, it’s sometimes hard to decide between “is it me not being good at explaining in a foreign language?” and “is it them just not getting it at all????”). As a consequence, I’ve written a lot of the static content on the current website, including a few bad puns. And even if I’ve been happy to correct a few mistakes made by native English writers, I’m still wondering whether I haven’t added more than I corrected in the end.

Here’s what we had and where we got:


Lead acquisition and nurturing

Once you have a unified vision of your world (CRM), a robust communication structure (website), and a SaaS to sell, your to-do list is pretty straightforward:

  • increase traffic by creating great content and spreading it
  • capture leads
  • nurture them

I was concerned about the content part, because our area is pretty technical (read: too technical for me to write about), but it turns out we’ve been lucky enough to work with an expert writer in our domain. I’m not talking about a fresh marketing graduate doing research online to write a few blog posts. I’m talking about a real expert, gifted at writing code and writing content for the web, who brings a lot of input to the project. His name is Kin Lane, he’s even more famous as APIEvangelist, and somehow, he’s a machine. Let me tell you: don’t let anyone without a passion for something write about that very something on your behalf. That makes for boring content. Everybody hates boring content. Kin writes stories. People like stories.

Spreading the content is a little harder, as everybody is fighting for attention and there is already a lot of great content available everywhere. I did a lot of SEO work on our articles in order to improve our ranking and organic traffic. I also started experimenting with content syndication, and with some AdWords and retargeting, mainly to experiment and learn on small budgets. So far, we’ve noticed a small increase in traffic, so we’re getting the feeling that all of this pays off, but with only 2 months since the release of the new website and just 1 month of frequent blog posting, it’s hard to draw any conclusions yet.

Capturing leads is done, of course, through Pardot forms, mainly when people download our white papers. We also consider users of our free plan as leads.

All of these people are nurtured based on their behavior. I’ve built several automated campaigns triggered by what people do and how they engage with our digital assets (portal, website). The scoring system associated with these behaviors gives us a clear view of our warmest prospects, so the sales team can follow up directly and engage with them.

All those new things put together give a good idea of the journey I went through this year. Once again, it’s a little early to evaluate the real business impact of what I did, and a lot of things can still be improved. But we built what I’d call a “minimum viable marketing machine”, and we’re already learning from all the data we are now collecting.

One of the next steps is going to be the launch of a new pricing tier, because we identified that our first paid plan might be a little expensive as a first step after the free plan. We’re going to add an intermediate version targeting people who hit the limits of the free plan but have never upgraded so far.

We’re also starting to have a better understanding of our customers’ journey, not just with us, but through the API landscape as a whole. We’re discovering that while not everybody is ready for what Streamdata.io does (i.e. streaming APIs), we get more and more requests for consulting, thanks to our expertise, to help people and companies understand how to do APIs correctly. We’re happy to do it, as it’s both a new line of business and an opportunity to engage with potential future customers of our core service.

A year ago today was my last day at SoftBank Robotics. I’ve learned a lot this year. And I hope my work will help make a difference for Streamdata.io in the future.
