
How It Works

This is how the parts fit together.

If you’re not the most technical person, all you need to know is that interactive experiences are:
  • Created using IntuiFace Composer
  • Run using IntuiFace Player
  • Tracked using IntuiFace Data Tracking
  • Published, shared and deployed using the IntuiFace Share and Deploy console
  • Controlled remotely from any handheld device using IntuiFace IntuiPad
  • Connected to any Web API thanks to IntuiFace API Explorer

A Day in the Life

At a high level, what do you do with IntuiFace? You Create, Publish, Deploy, and Run interactive experiences. But there is a lot of power underneath that simplicity. The video below looks under the covers and shows you all of the capabilities available at your fingertips. Underneath this video is a link that will open the entire diagram in a single view. Print it out and hang it on your refrigerator!

Open the IntuiFace Workflows complete image

Introducing the IntuiFace Family

IntuiFace Composer: The authoring tool

Composer is where you will create your interactive experiences using mouse and keyboard. An embedded version of Player (see next section) ships with Composer so you can test your work. No touch screen is required; think of your mouse as a single finger. For detailed technical specifications, click here. To see what an IntuiFace project looks like under the covers, have a look at our online documentation.
  • No-code paradigm: Never write a line of code. Every capability is designed for use by even the most non-technical of users. Unleashes creativity by removing technical barriers.
  • Uses existing content (images, videos, documents, 3D models, audio, etc.): No proprietary formats, no reinvention/recreation of existing design work.
  • Includes dynamic components such as a Web browser and maps: No “sandbox” limitations. Freedom to access and display external design work and information.
  • Access to external data sources and business components: Fast “development” of interactive interfaces to existing business components. Includes an API Explorer, enabling the no-coding creation of dynamic integrations with any REST-based Web Service.
  • Fine control of appearance and geometry: Full creative freedom. No template constraints or design restrictions.
  • Triggers and actions: Extensive library with hundreds of triggers and actions to choose from. Again, and we'll keep saying it, without programming.
  • Supports asset binding: Mirror properties across graphic elements to create templates at design time and visually choreograph at runtime.
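The asset-binding idea above can be pictured in ordinary code: when a source property changes, every element bound to it updates automatically. This is a minimal, hypothetical sketch of property mirroring, not IntuiFace's actual mechanism (all names are invented):

```python
class Bindable:
    """Minimal property binding: when the source value changes, every
    bound target is updated automatically (illustrative only)."""
    def __init__(self, value=None):
        self._value = value
        self._targets = []

    def bind(self, setter):
        """Mirror this property onto a target; sync its current value now."""
        self._targets.append(setter)
        setter(self._value)

    def set(self, value):
        """Change the value and push it to every bound target."""
        self._value = value
        for setter in self._targets:
            setter(value)

# Two 'labels' mirroring one title property
label_texts = []
title = Bindable("Welcome")
title.bind(label_texts.append)       # binding syncs the current value
title.set("Hello, visitor!")         # all bound targets update together
# label_texts == ["Welcome", "Hello, visitor!"]
```

In IntuiFace this mirroring is configured visually at design time; the sketch only shows the underlying idea.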

IntuiFace Player: The runtime

Player is the bit of software necessary to run all of the interactive experiences you create in Composer. Well, it's a little more than a "bit of software": it's truly a virtual machine that interprets Composer's output and makes the whole execution feel like a native app. One Player is required for every Windows, iPad, Android, ChromeOS and Samsung SSP device running an experience deployed in the field. For detailed technical specifications, click here. To see what an IntuiFace project looks like under the covers, have a look at our online documentation.
  • Support for remote gestures and any input peripheral: On Windows PCs, enables use of input devices like RFID/NFC readers and barcode scanners as trigger sources
  • Drawing tools: Draw on the screen while presenting
  • Inter-Player communication: Creation of multi-screen collaborative experiences. Without programming!
  • Visual remote control: Drive Windows-based experiences wirelessly using iPad and Android-based devices. Includes visual touchpad for full interaction and drawing at a distance.
  • Agnostic to display make/model: No display vendor lock-in. Freedom to change display make/model without having to change experience.
  • White label option: Take ownership of the splash screen when an experience is automatically launched.

IntuiFace Data Tracking: The measuring stick

Data tracking makes it possible for IntuiFace users to identify the preferences and - when available - demographics of those who use their interactive content. This data can even be collected in an environmental context, meaning you can capture information like location and weather, potential influences on user decisions. For details, click here.
  • Log virtually any event (e.g. user action, data input, environmental info, etc.)
  • Use unlimited parameters for each logged event, maximizing the richness of reported information.
  • Automatic event capture for experience and scene start/stop, enabling you to measure dwell time.
  • Identify sessions (coupled with improved RFID/NFC tag reader support) so you can differentiate users.
  • Real time upload of event information - aka data points - to a centralized, cloud-based Hub, permitting almost immediate access to data across even the largest deployment.
  • Quick access to data in Excel or your own database - local copies enabling fast analysis.
  • Automated integration with Google Analytics, Mixpanel and Segment, permitting data point transfer to virtually any analytics, marketing or data warehouse platform on the market.
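Conceptually, each data point is just an event name plus arbitrary parameters and a timestamp. A minimal Python sketch of that structure; every field name here is an assumption for illustration, not IntuiFace's actual wire format:

```python
import json
import time

def make_data_point(event, **params):
    """Build an analytics-style data point: an event name plus
    unlimited key/value parameters and a capture timestamp."""
    return {
        "event": event,
        "timestamp": time.time(),
        "parameters": params,
    }

# A touch on a product card, enriched with environmental context
point = make_data_point(
    "item_selected",
    item="Blue Sneaker",
    scene="Catalog",
    weather="sunny",   # environmental info, when available
)
print(json.dumps(point, indent=2))
```

Because parameters are unconstrained key/value pairs, the same shape can carry user actions, data input or environmental readings alike.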

IntuiFace Share and Deploy: The link to the outside

Accessible via our Web-based Management Console - home for a variety of functions including license and purchase management - is the Share and Deploy console. It makes it possible for you to manage experiences: publish them to the cloud, share them with others, and deploy them to geographically distributed devices - all without leaving your desk. IntuiLab hosts the service, so there is nothing for you to install and no advanced degree required. For details, click here.
  • Publish experiences to the cloud: One-click upload of your projects to a free IntuiFace Cloud Storage account or to your own FTP or Amazon S3 storage.
  • Share experiences with colleagues and clients for editing: Just enter an email address. A sharing notification email is sent and Composer flags the newly shared experience as available for download.
  • Share experiences with clients for review: Distribute a unique URL. Entering it in a browser on Windows, iPad, Android and Chrome will automate installation of a self-running version of your experience.
  • Remotely deploy experiences: Never again walk the halls or hit the road with USB stick in hand.
  • Control distribution: A real-time inventory of any Internet-accessible instance of IntuiFace Player - regardless of geography - for device-level deployment of experiences.
  • Access through Web-based control panel: Accessible through any Web browser running on any operating system. You don't even need IntuiFace Composer or Player running on the same machine to use it.
  • Monitor with crash recovery: For Windows PCs, restart experiences and Player itself if the remote device was rebooted. Across all supported operating systems, you can even see the active scene in the running experience on each device.

IntuiFace IntuiPad: The remote control

IntuiPad is your handheld window to a running IntuiFace experience. Using any iPad or Android-based device, this free software displays a running experience, captures your touches on the handheld display, and transmits those touches back to the experience. Walk amongst your audience as they view your presentation on a large, non-interactive screen. For details, click here.
  • Visual Touchpad: Directly manipulate (touch, pinch, zoom, swipe) a reproduction of your presentation
  • On-Screen Annotation: Drawings on your handheld appear on all screens showing your running presentation
  • Automated Discovery: Locate and control any presentation located on your local intranet or Internet
  • Quick Select Toolbar: Easy access to common commands like Next and Previous
  • HTML5-based: Architected for rapid feature adoption and portability
  • Real-Time Performance: Near-zero lag time between handheld touches and results on the audience-facing displays

IntuiFace API Explorer: The cloud connector

API Explorer enables the no-coding creation of support for any REST-based Web Services query, opening the door to thousands of public and private APIs accessible via Web Services. That includes everything from movie listings and weather forecasts to currency conversion, the latest photos from NASA and all those connected objects among the Internet of Things. This thing is so powerful and unique that we're patenting it! For details, click here.
  • Supports entry of any request URL or curl statement; displays the contents of any XML or JSON-formatted response.
  • Uses a machine learning engine to accurately type each property (is it text? currency? webpage?) and to identify the values thought most likely to be important for display in your experience.
  • Automatically generates a dynamic connector (aka an "interface asset") for the specified query and creates a visualization, within your experience, of the properties you specified
  • Enables the automatic display of multi-page results in a single view, avoiding the need to manually sift through each page one at a time.
  • Permits on-the-fly modification of host, endpoint and query parameters to tailor your Web request for your specific needs
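Behind the scenes, what API Explorer automates amounts to an ordinary REST round trip: send a request URL, parse the JSON response, and expose only the properties you care about. A hand-rolled Python sketch of that round trip; the endpoint and field names are hypothetical:

```python
import json
from urllib.request import urlopen

def select_fields(data, fields):
    """Keep only the response properties of interest - roughly what a
    generated 'interface asset' exposes to an experience."""
    return {f: data.get(f) for f in fields}

def query_api(url, fields):
    """Send a plain REST GET request and parse the JSON body."""
    with urlopen(url) as resp:
        return select_fields(json.load(resp), fields)

# Hypothetical weather endpoint; with API Explorer the equivalent
# connector is generated without writing any of this code.
# weather = query_api("https://api.example.com/weather?city=Paris",
#                     ["temperature", "conditions"])
```

API Explorer adds what this sketch lacks: typing each property, guessing which values matter, and paging through multi-page results automatically.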

How It All Fits Together

This is the first diagram we show teams interested in a 20,000-foot view of the IntuiFace family. We call it our markitecture - an architecture simplified for discussions with both technical and non-technical folks. Click the link below the image to get a larger version for your bathroom mirror.
Open a larger version of the IntuiFace Markitecture image

Unique Technologies under the hood

An industry-unique Multi-Touch Gesture Recognition Engine (MGRE) rapidly maps any number of concurrent input device events – touch, gesture, tag – to onscreen visual elements.
  • Patented technology: Based on ground-breaking R&D that is unparalleled in the industry. See the patent here.
  • No touch limit: Works with anything from a single touch display to 64+ concurrent touch points across a multi-screen display.
  • Married to interaction-centric computing model: On-screen events act as triggers of if/then relationships, resulting in rapid visual responses with potentially complex choreography.
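The mapping the MGRE performs can be pictured as hit-testing: route each concurrent touch point to the topmost on-screen element that contains it. A toy Python illustration of that idea (not the patented engine itself; all coordinates and names are invented):

```python
def hit_test(touches, elements):
    """Route each concurrent touch point to the topmost element
    containing it. `elements` is ordered front-to-back as
    (name, x, y, width, height) rectangles."""
    routed = {}
    for tid, (tx, ty) in touches.items():
        for name, x, y, w, h in elements:
            if x <= tx < x + w and y <= ty < y + h:
                routed[tid] = name   # first hit wins: topmost element
                break
    return routed

# A photo on top of a full-screen background
elements = [("photo", 100, 100, 200, 150), ("background", 0, 0, 1920, 1080)]
touches = {1: (150, 160), 2: (900, 500)}
# hit_test(touches, elements) → {1: "photo", 2: "background"}
```

Each routed event then acts as a trigger in the if/then interaction model described above.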
Operating system independence means you can play your experiences on multiple platforms without requiring any technical know-how
  • Portable: Build on Windows; run on Windows, iOS on the iPad, Android, Chrome, and Tizen on the Samsung SMART Signage Platform.
  • Modern architecture: On Windows, IntuiFace is optimized to leverage native libraries to maximize speed. On all other platforms, HTML5 is used to avoid proprietary technology while achieving maximum performance.
  • Future proof: IntuiFace can easily accommodate new OS releases as they appear. And your work can always be run across platforms should your needs change.
Work with virtually any interactive approach you can imagine as both input and output. Learn more.
  • Internet of Things (IoT)
  • Beacons
  • NFC/RFID
  • Voice (both speech recognition and text-to-speech)
  • Sensors (e.g. proximity, motion, face detection)
Possess the power of programming without having to write a line of code.
  • Triggers & Actions: Think of it as 'when X then do Y'. Over 200 triggers and 200 actions can be combined.
  • Binding: Map values from any source - even sources external to your experience - to visual elements inside IntuiFace.
  • Animation: Compose visual choreography in response to any trigger.
  • Excel as data storage: Use Excel as a poor man's database, storing information for display or offline study.
  • API access: Communicate, in real time, with any API-accessible data store, business service or device.
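The "when X then do Y" model maps naturally onto a table of trigger-to-action rules. A toy Python sketch of that idea; every name here is invented for illustration, since IntuiFace itself requires no code:

```python
# Toy trigger/action engine: 'when X then do Y', with multiple
# actions allowed per trigger. All names are illustrative.
rules = {}

def when(trigger, action):
    """Register an action to run whenever the trigger fires."""
    rules.setdefault(trigger, []).append(action)

def fire(trigger, *args):
    """Fire a trigger and run every action bound to it, in order."""
    for action in rules.get(trigger, []):
        action(*args)

log = []
when("image_tapped", lambda name: log.append(f"zoom {name}"))
when("image_tapped", lambda name: log.append(f"play sound for {name}"))

fire("image_tapped", "sunset.jpg")
# log is now ["zoom sunset.jpg", "play sound for sunset.jpg"]
```

In Composer the same rules are assembled visually from the library of triggers and actions; the sketch only shows the execution model.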
With machine learning, IntuiFace will learn your preferences and surface best practices from community input. Learn more.
  • API exploration: Remove the complexity of Web APIs by preselecting content considered most likely of interest to a designer.
  • Design assistance: Highlight design options well-suited to selected data and personalized for your aesthetics.
  • Layout wizard: Rapidly choose from visual options to speed your creation efforts.