If you’re not the most technical person, all you need to know is that interactive experiences are:
built using IntuiFace Composer
run using IntuiFace Player
(optionally) tracked using Data Tracking
(optionally) published, shared and deployed using IntuiFace Management Console
(optionally) controlled remotely from any handheld device using IntuiFace IntuiPad
The whole point of our platform is to remove complexity and end the days when interactivity required a PhD in software development.
IntuiFace Composer: The authoring tool
Composer is where you will create your interactive experiences using mouse and keyboard. An embedded version of Player (see next section) ships with Composer so you can test your work. No touch screen is required; think of your mouse as a single finger. For detailed technical specifications, click here. To see what an IntuiFace project looks like under the covers, have a look at our online documentation.
No-code paradigm: Never write a line of code. Every capability is designed for use by even the most non-technical of users. Unleashes creativity by removing technical barriers.
Uses existing content (images, videos, documents, 3D models, audio, etc.): No proprietary formats, no reinvention/recreation of existing design work.
Includes dynamic components such as a Web browser and maps: No “sandbox” limitations. Freedom to access and display external design work and information.
Access to external data sources and business components: Fast “development” of interactive interfaces to existing business components.
Fine control of appearance and geometry: Full creative freedom. No template constraints or design restrictions.
Triggers and actions: Extensive library with hundreds of triggers and actions to choose from. Again - and we’ll keep saying it - without programming.
Supports asset binding: Mirror properties across graphic elements to create templates at design time and visually choreograph at runtime.
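To make the trigger/action idea concrete: in Composer you pair an on-screen event (the “if”) with one or more responses (the “then”) entirely through the visual editor. The sketch below is illustrative only — IntuiFace is a no-code product, and every class and method name here is a hypothetical stand-in for what the editor does behind the scenes, not an actual IntuiFace API.

```python
# Hypothetical sketch of the if/then trigger-action pairing described above.
# None of these names come from IntuiFace; they exist only to illustrate the model.

class Element:
    """A visual element that can react to triggers with actions."""

    def __init__(self, name):
        self.name = name
        self.triggers = {}  # trigger name -> list of actions to run

    def when(self, trigger, action):
        """Pair a trigger ("if") with an action ("then")."""
        self.triggers.setdefault(trigger, []).append(action)

    def fire(self, trigger):
        """Run every action registered for this trigger, in order."""
        return [action() for action in self.triggers.get(trigger, [])]

# Example: tapping an image plays a video.
image = Element("product-photo")
image.when("is tapped", lambda: "play promo.mp4")
print(image.fire("is tapped"))  # ['play promo.mp4']
```

In Composer the same pairing is done by picking a trigger and an action from dropdown lists, with no code involved.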
IntuiFace Player: The runtime
Player is the bit of software necessary to run all of the interactive experiences you create in Composer. One Player is required for every Windows, iPad, Android, ChromeOS, Samsung SSP and LG webOS device running an experience deployed in the field. For detailed technical specifications, click here. To see what an IntuiFace project looks like under the covers, have a look at our online documentation.
Support for remote gestures and any input peripheral: On Windows PCs, enables use of input devices like RFID/NFC readers and barcode scanners as trigger sources
Drawing tools: Draw on the screen while presenting
Inter-Player communication: Creation of multi-screen collaborative experiences. Without programming!
Visual remote control: Drive Windows-based experiences wirelessly using iPad and Android-based devices. Includes visual touchpad for full interaction and drawing at a distance.
Agnostic to display make/model: No display vendor lock-in. Freedom to change display make/model without having to change experience.
White label option: Take ownership of the splash screen when an experience is automatically launched.
IntuiFace Data Tracking: The measuring stick
Data tracking makes it possible for IntuiFace users to identify the preferences and - when available - demographics of those who use their interactive content. This data can even be collected in an environmental context, meaning you can capture information like location and weather, potential influences on user decisions.
Log virtually any event (e.g. user action, data input, environmental info, etc.)
Use unlimited parameters for each logged event, maximizing the richness of reported information.
Automatic event capture for experience and scene start/stop, enabling you to measure dwell time.
Identify sessions (coupled with improved RFID/NFC tag reader support) so you can differentiate users.
Real time upload of event information - aka data points - to a centralized, cloud-based Hub, permitting almost immediate access to data across even the largest deployment.
Quick access to data in Excel or your own database - local copies enabling fast analysis.
Automated integration with Google Analytics, Mixpanel and Segment, permitting data point transfer to virtually any analytics, marketing or data warehouse platform on the market.
Offline log storage for devices that lose or don't have Internet connectivity.
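The points above can be pictured as a simple data-point shape: an event name, a timestamp, and an open-ended set of parameters, queued locally when the device is offline and uploaded to the Hub once connectivity returns. The field names and queue below are assumptions for illustration only, not the actual Data Tracking wire format.

```python
# Hypothetical shape of a logged data point; all field names are assumptions,
# not the real IntuiFace Data Tracking format.
import time
from collections import deque

offline_queue = deque()  # holds data points while the device has no connectivity

def log_event(name, **parameters):
    """Build a data point: any event name plus unlimited key/value parameters."""
    point = {
        "event": name,
        "timestamp": time.time(),
        "parameters": parameters,  # e.g. scene name, session id, weather...
    }
    offline_queue.append(point)  # drained to the cloud Hub when back online
    return point

p = log_event("scene_start", scene="Welcome", session_id="rfid-1234")
print(p["event"], p["parameters"])
```

This is why sessions, dwell time, and environmental context can all ride along as parameters on ordinary events rather than requiring a fixed schema.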
IntuiFace Management Console: The hub
Management Console makes it possible for you to manage experiences: publish to the cloud, share with others, and deploy to geographically distributed devices - all without leaving your desk. IntuiLab hosts the service so there is nothing for you to install and no advanced degree required. For details, click here.
Publish experiences to the cloud: One-click upload of your projects to any FTP, Amazon S3 or free IntuiFace Cloud Storage account.
Share experiences with colleagues and clients for editing: Just enter an email address. A sharing notification email is sent and Composer flags the newly shared experience as available for download.
Share experiences with clients for observation: Distribute a unique URL. Entering it in a browser on Windows, iPad, Android or Chrome automatically installs a self-running version of your experience.
Automate experience deployment: Never again walk the halls or hit the road with USB stick in hand.
Control distribution: A real-time inventory of any Internet-accessible instance of IntuiFace Player - regardless of geography - for device-level deployment of experiences.
Access through Web-based control panel: Accessible through any Web browser running on any operating system. You don't even need IntuiFace Composer or Player running on the same machine to use it.
Monitor with crash recovery: Restart experiences and Player itself if the remote device was rebooted. You can even see the active scene in the running experience on each device.
IntuiFace IntuiPad: The remote control
IntuiPad is your handheld window to a running IntuiFace experience. Using any iPad or Android-based device, this free software displays a running experience, captures your touches on the handheld display, and transmits those touches back to the experience. Walk amongst your audience as they view your presentation on a large, non-interactive screen. For detailed information, click here.
Visual Touchpad: Directly manipulate (touch, pinch, zoom, swipe) a reproduction of your presentation
On-Screen Annotation: Drawings on your handheld appear on all screens showing your running presentation
Automated Discovery: Locate and control any presentation running on your local intranet or over the Internet
Quick Select Toolbar: Easy access to common commands like Next and Previous
HTML5-based: Architected for rapid feature adoption and portability
Real-Time Performance: Near-zero lag time between handheld touches and results on the audience-facing displays
Multi-Touch Gesture Recognition Engine (MGRE): The Magic
(not visible in the graphic)
Patented technology: Connects input device events – touch, gesture, tag – with on-screen visual elements.
Endlessly scalable: Works with anything from a dual-touch display to 64+ touchpoints.
Married to interaction-centric computing model: On-screen events act as triggers in if/then relationships, resulting in rapid response through potentially complex visual choreography.
Portable: An HTML5 version is at the core of Player on iPad, Android, Chrome, Samsung SSP and LG webOS devices.