
Spark for Developers


Since we introduced Administration APIs this past October, you have asked how you can build and test with these APIs without being an administrator in your organization. To help with this, we have created an Administrator Sandbox organization that we can make you an administrator of. Then, using this new organization, you can develop and test your integration.


There are a few rules and things you need to know about using this Sandbox organization, so read the documentation for details.

Last week, we introduced a bot that allows you to play the classic text adventure game Zork in Cisco Spark rooms or over SMS through Tropo.


Here’s how the bot works (links to the source code repo are at the end of this post).


First, a little history. Back in the 1980s, Infocom created a virtual machine called a Z-machine that Zork would run on. Writing new games was a matter of creating a game file that the virtual machine could read. This made porting the game to various computer platforms simpler, as they’d only have to port the virtual machine, not the entire library of games.


After Infocom shut down and its assets were absorbed into Activision, game enthusiasts reverse-engineered the Z-machine internals by studying the game files and ultimately created replacement Z-machines. One of these, Frotz, is the Z-machine driving this bot. Eventually, Infocom’s owner made Zork 1-3 available for free download, and the Zork 1 game file is installed in the bot.


Frotz runs as a Linux command-line process, and has a mode where you can pipe game commands to stdin and get the results on stdout. The zmachine-api project, created by some engineers at OpenDNS, uses Node.js to wrap a REST API around Frotz and manages starting and stopping the Frotz processes as needed. The Spark bot uses this API.
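To give a feel for how the bot talks to that wrapper, here is a minimal sketch. The route shapes below are illustrative assumptions, not the documented zmachine-api interface, so check the project's README for the real endpoints:

```javascript
// Sketch: building requests against a zmachine-api server.
// The base URL and route shapes are assumptions for illustration.
const API_BASE = 'http://localhost:3000';

// URL for sending one game command to a running Frotz process,
// identified by the label/pid the server handed back at start time.
function actionUrl(pid) {
  return API_BASE + '/games/' + pid + '/action';
}

// The command itself would travel in the POST body, e.g. { action: 'look' },
// and the response body would carry Frotz's stdout text for that turn.
console.log(actionUrl('zork-1'));
```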


The Spark bot relies on two webhooks. The first, on message creation, is set up without a roomId filter, so all messages that @mention the bot or are in a direct conversation with the bot are delivered to it. The second is a membership-created webhook, also without any filters, so that the bot is notified any time it is added to a Spark room.
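Both webhooks are registered with a POST to the Spark /webhooks resource. A minimal sketch of the two payloads (the names and target URL here are placeholders you would supply yourself):

```javascript
// Sketch: the two firehose webhook registrations the bot needs.
// No roomId filter is set, so the bot hears about every message and
// membership event it is allowed to see.
function buildWebhookPayload(name, resource, event, targetUrl) {
  return {
    name: name,
    targetUrl: targetUrl,
    resource: resource, // "messages" or "memberships"
    event: event        // "created"
  };
}

const messagesHook = buildWebhookPayload(
  'zork-messages', 'messages', 'created', 'https://example.com/webhook');
const membershipsHook = buildWebhookPayload(
  'zork-memberships', 'memberships', 'created', 'https://example.com/webhook');

// Each payload would then be POSTed to https://api.ciscospark.com/v1/webhooks
// with an Authorization: Bearer <bot token> header.
console.log(messagesHook.resource, membershipsHook.resource);
```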


When the bot is notified by the memberships webhook that it’s been added to a room, it immediately posts to the room, introducing itself. It explains what it is and reminds room participants that they need to @mention the bot if they want to play. This small onboarding mechanism helps room participants understand why a bot was just added to a room and what it can do. This exact onboarding method isn’t appropriate for all bots, but when you are creating a bot, you should think about what it should do when it first comes alive in a room.


Each Spark room is a separate instance of the game. Move around the Zork world in one room, and then go to a different room, and you’ll find your Zork game there is still in the same place it was when you left it.


Another bit of membership webhook magic is used when the bot is added to a room that already has a game running. If you started playing, remove the bot, and then add it back in again, the game should pick up right where you left off. But the bot should still introduce itself, and also remind the players where they are in the game. Zork has a command called “look” that causes it to tell you what your current surroundings are. So when the bot is added to a room and sees there’s already a game for that room ID, it issues the “look” command and shows the players what is going on.


The messages webhook is how the bot gets commands from players in the game. When the bot is mentioned, or when someone speaks to it in a one-on-one room, the bot takes the player's exact input and sends it to Zork. The webhook triggers the bot, the bot uses the message ID from the webhook to fetch the message content, and then sends that content to the game.
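That two-step flow exists because the webhook body carries only the message ID, not its text. A small sketch (the sample payload is trimmed to the fields used here):

```javascript
// Sketch: turning a "messages created" webhook into a game command.
// The webhook body identifies the message; the bot must then fetch it.

// Pull the message ID out of a webhook notification body.
function messageIdFromWebhook(webhookBody) {
  return webhookBody.data && webhookBody.data.id;
}

// Build the URL the bot would GET (with a Bearer token) to read the text.
function messageUrl(messageId) {
  return 'https://api.ciscospark.com/v1/messages/' + messageId;
}

// Example webhook body, trimmed to the fields used here:
const webhookBody = { resource: 'messages', event: 'created',
                      data: { id: 'MSG123', roomId: 'ROOM456' } };

const id = messageIdFromWebhook(webhookBody);
// The fetched message's text field is what gets sent on to Zork.
console.log(messageUrl(id));
```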


One thing that isn’t immediately obvious when creating a Spark bot is how to strip the bot’s name from messages when it’s mentioned. For many bots, this can be as simple as matching on the bot’s name and stripping that string out. But because this bot has a two-word name (“Text Adventure”), the mechanics of Spark mentions mean that other people in the room can affect how the mention appears in a message. Spark puts the person’s first name in the message when you mention them, unless there are several people in the room with the same first name; then, to avoid ambiguity, Spark puts the full name in. For a bot like Text Adventure, the presence of another bot in the room called “Text Mom” could mean that sometimes the @mentioned name is “Text” and sometimes it’s “Text Adventure”.


To handle this issue, when the bot server first starts, the bot calls GET /people/me to find out who it is. It then stores its own personId and uses that to remove mentions by running a regular expression over the HTML markup that wraps a mention.

if (message.html) {
  // strip the mention markup & any remaining HTML from the message
  var pattern = new RegExp('<spark-mention[^>]*data-object-id="' + sparkbotself + '"[^>]*>[^<]*</spark-mention>');
  action = message.html.replace(pattern, '');
  action = action.replace(/<[^>]*>/g, '');
} else {
  action = message.text;
}

Another thing the bot needs to do when processing the message webhook is ignore its own messages. A bot’s message to a room will trigger the webhook from Spark, and you don’t want the bot getting caught in a loop where it is answering its own message, with each answer triggering a new message. When the message webhook arrives, the bot compares the personId that sent the message to its own personId and ignores it if they match.
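That guard is tiny in code. A minimal sketch (the IDs here are placeholders):

```javascript
// Sketch: dropping the bot's own messages to avoid a feedback loop.
// sparkbotself would hold the personId returned by GET /people/me at startup.
const sparkbotself = 'BOT_PERSON_ID';

function shouldIgnore(webhookBody) {
  // Every message the bot posts comes back to it as a webhook; skip those.
  return webhookBody.data.personId === sparkbotself;
}

console.log(shouldIgnore({ data: { personId: 'BOT_PERSON_ID' } })); // true
console.log(shouldIgnore({ data: { personId: 'SOMEONE_ELSE' } }));  // false
```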


To keep different instances of the game separate, zmachine-api uses a “label” to tag a Frotz process and later find the process ID that a command should be sent to. Each label becomes a different instance of the game. The bot uses the Spark room ID as the label, regardless of whether you’re playing in a direct or group room.


Because we’re starting lots of games, and to keep from having thousands of idle Frotz processes running, the bot starts a new process for every in-game command, then saves the game and shuts down the process when the command completes. This also ensures that if the server is restarted, all of the games can pick up where they left off. A side effect is that the “moves” counter in the game appears to increment by two on every in-game move, since Zork considers saving the game to be a move.


Here’s a flowchart of the bot’s logic. Spark-specific bits are in green, Tropo-specific operations for SMS are in blue, and interactions with the Zork engine and zmachine-api are in orange.


zbot flow.png

And finally, the code behind the Zork bot is available for your own use. Grab it from my GitHub repository. It’s all written for Node.js, using the Express framework. Included in the repository is a Dockerfile for spinning up a container with the game, and a docker-compose example that will start up multiple containers with the game and all its dependencies.

You can now play Zork in Cisco Spark. Add the bot to a Spark room, or start a direct conversation with it, and you’ll be invited to play the classic text adventure game Zork: The Great Underground Empire - Part I.


Each room you add the bot to is a different game of Zork. All the participants of the room can play the game together.


The bot also has SMS support through Tropo. If you’re in the US, you can play Zork over text message by texting something to +1-844-373-9675. Outside the US, try +1-541-936-9675, but this has a lower capacity and may not work as well during busy times. In either case, your carrier's normal text or data rates apply.

Next week, we’ll explain how the bot works, show some code, and walk through some of the helpful user interface things you can do in your own bot.

Earlier this week, a new feature was added to Spark that tells you whether a person is available. Based on your usage, we determine whether you're actively using Spark, and the client then helps people understand whether you're online by showing something like "Active 10 minutes ago" next to your name.


Now the Spark APIs make this same information available to you. The People API has two new fields, lastActivity and status, that help your integration or bot determine whether a person is active. The lastActivity field gives you the date and time that Spark last saw the person doing something - writing a message or having a call, for example. The status field tells you whether the person is "active" or "inactive".


These fields are read-only; there's no way to set a person's status through the API. Spark determines their status automatically, with no input needed from your side.


In the future, as Spark gets more presence options, the status field will gain those same options. So while you'll only see "active" and "inactive" today, you'll want to make sure your application doesn't break or behave oddly if you get a different status entirely.
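One defensive pattern is to treat anything other than the two known values as an explicit "unknown" case rather than assuming it means inactive. A small sketch (the sample objects are illustrative, not real API payloads):

```javascript
// Sketch: handling the status field defensively, since the API may grow
// new presence values beyond "active" and "inactive" in the future.
function describeStatus(person) {
  switch (person.status) {
    case 'active':
      return person.displayName + ' is active';
    case 'inactive':
      return person.displayName + ' was last seen ' + person.lastActivity;
    default:
      // Unknown value: surface it rather than misreporting the person.
      return person.displayName + ' has status "' + person.status + '"';
  }
}

console.log(describeStatus({ displayName: 'Ada', status: 'active' }));
console.log(describeStatus({ displayName: 'Ada', status: 'dnd',
                             lastActivity: '2016-12-01T10:00:00.000Z' }));
```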

We have had a pretty busy couple of months around here at Spark for Devs. We've been spreading the good word about Cisco Spark to events around the world. I just wanted to take a minute here in this blog post to wrap up what we've been up to lately.


October 27 - TechCrunch Meetup in Seoul

TechCrunch held a very successful meetup in Seoul, South Korea, where many of the attendees were involved in a pitch-off. The winner of the pitch-off was the AI-powered scheduling assistant, Konolabs. You can click the link above to see a video with highlights and here are a couple of pictures from the night...

tc seoul 2.png

November 7 - Cisco Live in Cancun

Cisco Live is always a good time, especially when it's in beautiful Cancun. People from all over Latin America gathered in Cancun to get a taste of what Cisco is up to and get hands-on with Cisco Spark and Tropo. Here are a few pictures from that event.


cl cancun 3.png


November 14 - LAUNCH Scale in SF

Back in San Francisco, California, USA, startup founders gathered to put their ideas to the test and learn from the best at LAUNCH Scale. We were there to hear the speakers and let everyone know how great Spark can be for a growing company.


November 15 - TADSummit in Lisbon

In Lisbon, telecom app developers gathered to show off their stuff and exchange ideas with the best and brightest that the telecom industry has to offer. Proving that telecom isn't a tired, old industry, TADHack also hosted hackers putting together full apps in under two days. Cisco Spark and Tropo were there to sponsor, learn, and help. Click on the link above to watch some video, see some pictures, and read a full write-up of the event.


November 17 - Spark Open House in NYC

In NYC, customers were invited to attend the first Spark Open House event. This one-day event included eight Cisco Spark stations covering Cisco Spark Message, Meet, Call, and APIs. The Spark APIs station covered integrations, bots, and open source development. Cisco Spark Depot partners Status Hero, Talos Digital, and Cumberland Group joined us to showcase the integrations and bots they have created. Connecting customers with our partners is our mission!




November 20 - Geektime TechFest in Tel Aviv

If you're at all interested in tech and were in Israel on November 20th, this is exactly where you wanted to be! On top of a conference and a hackathon, Geektime TechFest also hosted a startup competition that boasted some pretty fabulous ideas, including the winner of the competition, Imperson, a chatbot that promotes and sells products and services. There were so many fun times and too many pictures to pick just a few to post here. Check out their Facebook page to see a bunch of fabulous moments from the event!




November 22 - Chatbot Summit in Tel Aviv

Speaking of chatbots, Chatbot Summit featured some of the greatest minds in the chatbot world, who came together to talk, teach, and learn about the future of automated messaging. Taking place as part of Geektime TechFest, it was simple to go back and forth and consider chatbots within the realm of technology as a whole. Check the Facebook page linked above for more info and pictures of the events!


November 25 - Codemotion in Milan

Who doesn't enjoy food and wine in Italy? Many tech evangelists presented at Codemotion ranging from IoT to Big Data. Jason Goecke opened up the event with an inspirational presentation focusing on developers and introducing attendees to Cisco DevNet. Steve Sfartz and Angelo Fienga dug into the interaction between IoT and humans.


codemotion 2.png

With the recent launch of the Cisco Spark Depot, the Depot replaces the integrations in Cisco Spark for Web. You will no longer see the Add Integrations option in Cisco Spark for Web; in each room, you will now see a link to the Cisco Spark Depot instead. Check out the Depot for all the latest integrations and bots for the Cisco Spark app.


The Cisco Spark Depot includes the same integrations that you used to access using Cisco Spark for Web, and more. Integrations that you've already set up using Cisco Spark for Web will continue sending you notifications as configured until November 30, 2016.


On November 30, 2016, we will ramp down those integrations and they will stop sending notifications. We suggest you replace your existing integrations with new ones from the Depot immediately after November 30th; it takes only a few minutes. If you enable the replacement integrations before November 30th, you may get duplicate notifications: one from the Depot integration and another from the integration set up using Cisco Spark for Web. Incoming and outgoing webhooks will be available from the Depot soon.


Visit the Cisco Spark Depot today!



Last month, we announced a new set of resources to administer Cisco Spark. Thanks to the new administration REST API, you can now not only add users to an organization, but also modify users’ roles or check licenses.


One obvious point is that you need to be a Spark administrator to use the new API. Simply reach out to your organization administrator to be added to the “empowered” Spark users list.


Currently evaluating Cisco Spark? It is worth mentioning that you can ask for an official evaluation via Cisco Spark Plus (note that the online buying experience is limited to the United States for now). The Spark account email provided in the form below will instantly be promoted to an “Administrator” of the newly created organization.



Now, let’s experience your new Administration API super-powers.


Log out and sign in again at the Spark for Developers portal. When you reconnect, a new Developer Access Token is issued with Cisco Spark administrator privileges.


Access the Organization interactive documentation and list your organization’s details.


What’s next? You’ll certainly want to start writing scripts to audit your Cisco Spark organization or automatically provision and de-provision users.


Well, when it comes to writing custom code, nothing beats the Code Generation feature of Postman... except a full-featured Cisco Spark API client, but at the time of this article, none of the existing community client SDKs support the new Administration API... yet.


Good news! The DevNet Sparkers just updated the Cisco Spark Postman Collections project on GitHub: check the Administration collection. Fully scripted, the collection lets you invoke the new Administration resources incrementally, generate code in a language of your choice, and browse the dynamically generated documentation.


You’re now a click away from importing the admin collection into Postman and starting to code your awesome new Cisco Spark administration scripts. Happy hacking!


- Stève Sfartz, API Evangelist

This update applies to those sending or retrieving message attachments using the Spark API. Message attachments are limited to 100MB for both upload and download - you can upload or download a file 100MB or smaller, but not anything larger. Larger files can still be uploaded or downloaded via the Spark clients; this is specifically an API limit intended to prevent oversaturation of resources. We previously documented that files 2GB or smaller were supported, but that limit had to be scaled back after reviewing traffic patterns and load averages.
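If your integration downloads attachments, one way to respect the limit is to check the Content-Length reported for the file before pulling it down. A minimal sketch of that check (the header value would come from your own HEAD or GET request):

```javascript
// Sketch: checking an attachment against the 100MB API limit before
// attempting a download, using a Content-Length header value.
const MAX_ATTACHMENT_BYTES = 100 * 1024 * 1024; // 100MB

function withinApiLimit(contentLengthHeader) {
  const size = parseInt(contentLengthHeader, 10);
  // If the header is missing or unparseable, be pessimistic and refuse.
  if (isNaN(size)) return false;
  return size <= MAX_ATTACHMENT_BYTES;
}

console.log(withinApiLimit('52428800'));  // 50MB  -> true
console.log(withinApiLimit('209715200')); // 200MB -> false
```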


We focus on providing you the best experience and we apologize for any impact this has on you, your business and customers. We are working closely with teams to ensure any changes are known in advance.


If you have any support questions, feel free to reach out to us or join the DevSupport Spark room. Follow us on Twitter to keep updated.




- Spark for Developers Team

We’ll admit it: the Cisco Collaboration team is a little obsessed. When we’re not rolling out new products to help transform the way you work, we’re busy improving platforms you already use. Why? We believe in the power of continuously simplifying communication in all its forms, because we know the more intuitive and intelligent your tools are, the better you’re able to focus on the work that matters to you.


We’re especially excited to unveil the Cisco Spark Depot, our latest, greatest expansion to Cisco Spark. Simply put, the Spark Depot is an ever-growing catalog of integrations and bots for businesses of all shapes and sizes. Now, the apps you rely on, like Salesforce, Trello, Jira, Box, and more, can be instantly configured to connect to the Spark team-rooms of your choosing. This means less flipping between applications. But it also means that the work you do in one tool is seen by the rest of your team. With a little help from the Cisco Spark Depot, you’ll work smarter while you work together.



First, some clarification is in order. Bots versus integrations: What’s the difference? An integration provides basic notifications for a service, acting on behalf of a Spark user. On the other hand, a bot can perform tasks for any of the users in the room, presenting itself as another user or machine account. In addition to sending messages, a bot can post files and respond to messages — even join calls.


If you’re a developer, partner, or ISV, you might be wondering how you can contribute your own bot or integration to the Spark Depot. You can easily submit your ideas through the Cisco Spark for Developers portal here. Once you do, we’ll run both a business and technical review. Users can submit feedback, report issues, and even request new integrations and bots through the support link in your app page. Now that the Spark Depot is live, why wait to submit? You’ll help make Cisco Spark even better, and become an early, visible player in our expanding platform ecosystem.

Simple for consumers, simple for developers. Those are just two of the many strengths of the Cisco Spark Depot, and we can’t wait to see how you use it. Though the Spark Depot launched today, we’re already thinking about what’s next. Stay tuned and follow @CiscoSparkDev for the latest news.


- Jason Goecke, GM

Say “hello” to my little friend...



Since bots have been the popular buzzword this year, we want to make sure you’re not missing out on all the action. Collaboration tools are evolving and integrating fast. Whether you’re interested in building your own bot or integrating Spark with another app, here at Cisco we’ve got you covered.


I’m happy to announce the Cisco Spark Apps Community of Interest. This is where you can start your journey as a “DevNet Sparker”. You can explore integrations in DevNet Creations, learn how to build an app in the Learning Labs, connect with peers in the #spark4dev room, and promote your own app in the Spark Depot. From the Spark Depot, your Spark app will get global visibility and hopefully the success it deserves.


Ready to get started? We have a ton of resources to help you build your first bot or integration.

  1. Check out the Spark Depot to start integrating Spark with popular tools like Box, GitHub, Salesforce and more.
  2. Learn how to use the Spark APIs to create your own bot and/or integration in the Cloud Collaboration Learning Track.
  3. Jump straight into the code with SDKs and sample code in Awesome Cisco Spark.


We hope you have fun creating some awesome apps. If you’re an engineer or developer wishing you could do something a little more creative, now is your time. Turn off a lightbulb from a Spark room, get notifications from a Jira update, find an available meeting room in your building. The sky’s the limit!


If you already have a bot that could inspire others, go to DevNet Creations to share the code. So glad to have you in the community!



Adrienne Moherek



P.S. Don’t forget to check out the Spark Apps Community

This update applies to those using the People API. We are making enhancements to support managing one’s own avatar. This will be available in the coming weeks, with more details to follow.


We focus on providing you the best features and experience. If you have any support questions, feel free to reach out to us or join the DevSupport Spark room. Follow us on Twitter to keep updated.


- Spark for Developers Team

Alice Cho

Announcing Admin APIs

Posted by Alice Cho Oct 21, 2016

Hello everyone,


We have received feedback from our growing partner community about Admin APIs, and we are happy to announce their upcoming availability. You can expect to see them on the Spark for Developers portal in the next few weeks. With Admin APIs, an admin will be able to provision a user, assign roles, assign licenses, and view an organization's license usage.


We focus on providing you the best features and experience. If you have any support questions, feel free to reach out to us or join the DevSupport Spark room. Follow us on Twitter to keep updated.


- Spark for Developers Team

Alice Cho

Changes to the People API

Posted by Alice Cho Oct 21, 2016

Hello everyone!

This update applies to those using the People API. We are adding additional attributes to the /people resource. New attributes include firstName, lastName, and timeZone information of the people.
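Once the new attributes roll out, a person object will carry them alongside the existing fields. A small sketch of reading them (the sample object below is illustrative, not a real API payload):

```javascript
// Sketch: consuming the new /people attributes. The sample person object
// is a placeholder shaped like an API response, not real data.
const person = {
  id: 'PERSON123',
  displayName: 'Ada Lovelace',
  firstName: 'Ada',
  lastName: 'Lovelace',
  timeZone: 'Europe/London'
};

// Build a greeting that uses the new firstName/lastName/timeZone fields.
function greeting(p) {
  return 'Hello ' + p.firstName + ' ' + p.lastName + ' (' + p.timeZone + ')';
}

console.log(greeting(person));
```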

We focus on providing you the best features and experience. If you have any support questions, feel free to reach out to us or join the DevSupport Spark room. Follow us on Twitter to keep updated.

- Spark for Developers Team

The Cisco Spark API comes with a great companion - its interactive documentation - that lets you quickly experiment with the API. If you have not tried it yet, take a minute to create a room... and don’t miss the “Test Mode” toggle button, where magic happens when you switch it on!


This documentation may fall short when it comes to code generation, or if you need to experiment with more advanced features such as pagination. This is where a REST client tool such as Postman can help.


We’re happy to announce that postman-ciscospark is now available on GitHub.


The project contains a Postman collection that lets you generate code for any Cisco Spark API resource: pick a resource and get a code snippet in the language of your choice.

Nodejs code snippet to list Spark Rooms


Now you’re just a few clicks away from this new superpower: import and configure the collection to try it for yourself!

Note that this collection leverages Postman’s scripting capability, so you can run queries without needing to copy-paste identifiers as you create new resources.


Not a Postman user yet? The REST tool comes with a free basic plan! Simply download the Chrome or desktop version. If you have personal collections that prove handy over time, please feel free to contribute them.


- Steve Sfartz

This year at TechCrunch Disrupt in San Francisco, the Spark4Devs team watched as some talented coders used our APIs to hack their way through the competition and emerge with some pretty amazing projects!


Take, for example, PepperHealth, which uses Cisco Spark and other platforms, including the Pepper humanoid robot, to detect, monitor, and alert healthcare workers to patients' conditions.


Or FoodWagon, a bot which uses the Sparkbot framework, "flint," to bring co-workers together during lunch time!


Both of these projects won prizes for their innovative uses of Spark during the TechCrunch Disrupt Hackathon. And even for the developers who didn't win a prize, there was pizza for all, and that makes everyone feel like a winner.



After the hackathon, we spread the good news about Cisco Spark around the convention floor while startups competed for tech dominance, and we listened to some giants of the industry explain just a little bit about how they got so big.




Thanks to everyone who supported us during TCDisrupt! Until next time, check out the other teams that utilized Cisco Spark and give all of them some love!