This document debunks common myths about marketing automation vendor selection and implementation. Although systems may look the same on the surface, they vary meaningfully in features and capabilities; integration is harder than expected; and the obstacles that cause failure are often outside the user's control once a system is chosen. The document therefore advocates choosing a system only after thorough preparation: defining requirements, testing integrations in advance, and following a systematic implementation process.
12. Related Findings
• Training spend has little impact
– > users spend what’s needed
• Ease of learning criteria has little impact
– > users learn what’s needed
• Most satisfied considered only 2 systems
– > value is in prepping for the comparison
• Most satisfied deployed quickly
– > because had prepared in advance
• Most satisfied did not use outside resources
– > because trained their staff
25. Reality: Change continues.
• Environmental change
– IPOs, acquisitions, investments
– Shift upstream opens market space
– Growth in B2C may impact B2B
– Scope expanding to full customer journey
– Scope expanding to ad tech
– Scope expanding to external data
– Big data adds more analytics
– Shared customer databases
– Identity resolution via devices, tags, reference sets
26. Reality: Change continues.
• System change
– Social, content, predictive
– Mobile formats, apps, targeting
– Cross-channel execution
– Unstructured data
– Advanced attribution
28. Choosing Well: Basic Features
• Email
– Builder, templates, personalization, dynamic content, spam scores, reuse
• Landing pages
– Builder, progressive profiling, question types, next steps
• Behavior tracking
– Cookies, device IDs, anonymous to known, association
• Campaign flows
– Branches, test splits, actions, triggers, schedules, real time, contact limits
• Lead scoring
– Data scope, rule complexity, depreciation, point caps
• CRM integration
– Connectors, custom tables, actions, APIs, sales interface
• Analytics
– Cross-channel, cross-campaign, attribution, forecasting, trends
• Technology
– Custom tables, APIs, data store, on-premise, data cleansing, enhancement
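The lead-scoring bullet above (data scope, rule complexity, depreciation, point caps) is easier to evaluate if you picture the mechanics. A minimal sketch follows; the rule weights, 30-day half-life, and 100-point cap are hypothetical illustrations, not values from any particular product.

```python
from datetime import datetime, timedelta

# Hypothetical rule weights -- real products let you tune these per activity.
RULE_POINTS = {"email_open": 5, "page_visit": 10, "demo_request": 40}
HALF_LIFE_DAYS = 30   # "depreciation": points halve every 30 days
POINT_CAP = 100       # cap so one hyperactive lead can't overflow the scale

def lead_score(activities, now):
    """activities: list of (activity_type, timestamp) tuples."""
    score = 0.0
    for kind, when in activities:
        age_days = (now - when).days
        decay = 0.5 ** (age_days / HALF_LIFE_DAYS)   # older points count less
        score += RULE_POINTS.get(kind, 0) * decay
    return min(round(score), POINT_CAP)

now = datetime(2024, 6, 1)
acts = [("demo_request", now - timedelta(days=30)),  # month-old demo request
        ("email_open", now)]                         # fresh email open
print(lead_score(acts, now))  # 40*0.5 + 5 = 25
```

When comparing systems, the questions to ask map directly onto these knobs: what data can feed the rules (data scope), how complex can a rule be, how is depreciation configured, and is there a cap.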
29. Choosing Well: Advanced Features
• Social
– Share to social, tracking, posting
– Social forms, track influence, external data
– Monitor & respond, promotions
• Content
– Builders, external discovery, shared library
– Classification/tagging, SEO scores, item-level results
– Multi-format (mobile, video, images, geo, etc.)
• Predictive
– Built-in or 3rd party
– Data types, prebuilt connectors
– Automated set-up & building, model types, self-adjust, user control
– Reports, explanations
30. Preparing Well: Integration
• Assess existing systems
– Available data, access methods, potential frequency, data quality
• Define your requirements (there’s that word again)
– Data sources, types of data (structured, unstructured, semi),
– data volumes, speed, scalability
– batch vs real time, triggers
– mapping, transforms, process flow, validation
– tech skills
• Select an approach (or several)
– System-specific connectors, generic connectors, open APIs, platform systems
• Make part of marketing automation requirements
• Test early and often
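The "mapping, transforms, process flow, validation" requirement above can be sketched concretely. The field names and rules below are hypothetical, not tied to any vendor's schema; the point is that every integration needs this layer somewhere, whether in a connector or in custom code.

```python
# Minimal sketch of the mapping / transform / validation step.
# FIELD_MAP translates hypothetical CRM field names to marketing-automation names.
FIELD_MAP = {"EmailAddr": "email", "FName": "first_name", "Lead_Source": "source"}

def transform_record(crm_row):
    """Map CRM field names, apply transforms, then validate the record."""
    # Mapping: rename fields the marketing automation system expects.
    mapped = {ma: crm_row[crm] for crm, ma in FIELD_MAP.items() if crm in crm_row}
    # Transform: normalize case/whitespace so deduplication works downstream.
    if "email" in mapped:
        mapped["email"] = mapped["email"].strip().lower()
    # Validation: reject records that can't be matched to a contact.
    if "@" not in mapped.get("email", ""):
        raise ValueError(f"invalid email in record: {crm_row!r}")
    return mapped

print(transform_record({"EmailAddr": " Jane@Example.COM ", "FName": "Jane"}))
# {'email': 'jane@example.com', 'first_name': 'Jane'}
```

"Test early and often" means running real exported records through exactly this kind of pipeline before committing to a system, since the rejects tell you about the data quality assessed in the first bullet.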
31. Preparing Well: Process
• Systematic process
– Define real requirements
– Focus on features
– Test integration in advance
– Plan slowly, deploy quickly
• Expect change
– Look for flexibility as well as features
– Plan to replace in three years
32. Remember This
1. You can’t unchoose the wrong system.
2. Success depends on preparation.
(1) Intro – trouble in paradise;
- these should be great times for marketing automation: 50% growth rates, ever wider adoption, IPOs and acquisitions
(2) - but there’s trouble in paradise
[value received]
- recently conducted a survey asking about marketing automation, and
- found 25% of users felt it wasn’t worth the investment -- not just vague dissatisfaction
- other surveys show similar results
- research also probed correlations between satisfaction levels and user behaviors
- found certain practices are more common among less satisfied users; strongly suspect they caused the dissatisfaction
- looking at this, see six myths that block marketing automation success
(3)- systems are all the same
- easy assumption, since they all look the same
- but think about your own business: you may look like competitors but know you’re different
(5) - but they’re really not [VEST data – overview and matrices]
- VEST slide showing features by system; note variation
- VEST matrices, showing that systems differences matter – different systems rate best for different users
(7) [consequences: careless selection; consider too few, pick too quickly]
– when people assume systems are the same, they don’t look very hard
- relate time spent, nbr considered to satisfaction
1. best/worst practices for selecting a system
(9) [consequence: wrong criteria] ignore features; focus on ease & cost
- explain satisfaction color coding
- people who select on technology are unhappy,
- those who select on feature breadth are happy
- cost, learning, support don’t have much impact on result
- THIS IS IMPORTANT: shows that people who don’t pay attention to features are likely to pick the wrong system (lacking features);
- as with picking too carelessly, means you can make a mistake because systems aren’t all the same
…we are NOT saying you need every feature; in fact, picking the right features is critical.
…but there’s more to worry about than features, which brings us to our second myth
(10) 2- integration is easy (since it’s a standard system feature)
- think integration is easy because it’s a standard system feature
(11) [Obstacles to Use]
- but in fact it ranks as the second most common obstacle and causes the most dissatisfaction
- also remember it was the most common evaluation criterion:
- so, unlike features, this is one that people did look for in advance
- what I take away from this is that it’s really hard to know in advance how well integration will work because
- you can’t test it like features
- integration depends on other systems outside of marketing automation, which may not be integration-friendly
- marketers aren’t very good at such things
- (we’ll see some support for that later when we look at big vs. little companies, and find that big companies have fewer integration problems)
- but just because it’s hard, doesn’t mean you shouldn’t try. We did see that people who had integration problems were a little happier if they had evaluated against integration than if they hadn’t. It may simply have been that no system could integrate with their current infrastructure. But at least you want to know about that in advance.
(13) [typical systems integrated]
- if you’re wondering what systems people integrated, here’s that data:
- 70% integrated CRM, and apparently that was mostly successful
- the real dissatisfaction came from people who tried to integrate BI and email, and that’s probably because those are core MA functions….so if you’re integrating an external system, that suggests you had a MA product that didn’t do what you needed – once more reinforcing the importance of selecting against features
- conversely, people who integrated with CMS and SEO were happy, presumably because they never expected their MA system to do that
- remember, this doesn’t tell us anything about people who tried and failed to do an integration – presumably, they were unhappy
(14) - failure is the user’s fault, not the system’s
Follows from the others: if any system can work, then failure must be the user’s fault
Common complaint, especially from vendors (not surprisingly)
(15) - [obstacles to success ] – obstacles that matter are beyond the user’s control
most common obstacle is learning the system, but it has high satisfaction, so it is clearly overcome
Same for other high-satisfaction items like organizational barriers and staff training: users control those and they are overcome
most harmful obstacles are integration, software cost, building infrastructure – those are beyond user control once the system is chosen
But it certainly is possible to do things wrong – the point is, things the user controls can be fixed after the fact; the system cannot: “you can’t unchoose the wrong system”
(17) other findings, not shown, make the same point (see the Related Findings list above)
(18) – crawl walk run works
(19) - one of the more surprising findings:
- satisfaction went up with number of features [nbr features to start] [deployment time]
- interpretation is, reflects better preparation
- supported by finding that satisfaction was higher with faster deployment
- also reflects preparation
(20) [feature use]
- don’t misinterpret: there is still a logical sequence for feature deployment
- satisfaction data is tricky, but basically shows that people want basic features from the start, are okay if they never use some advanced features, but are unhappy if they have to drop features
(21) 6- big companies are smarter
(22) [satisfaction by size]
- in fact, satisfaction is similar across sizes;
- there was more impact by industry: more common industries had higher satisfaction (consulting / business services, IT vendors, telecom/ISP), presumably because they had more experience
(24) [evaluation criteria by company size]
7. what big companies do better than small companies and where they fall short
- big companies do a better job of selection, but still are less satisfied
- big companies use more technical criteria inc. integration, external connectors, APIs, features
- big companies are less likely to focus on cost or ease of use
not shown:
- big companies select more slowly
- big companies use more features & are more satisfied with advanced features
(26) [obstacles by company size]
- big companies are less likely to have complexity and integration problems but very unhappy when they do
- big companies have more problems with organization, staff, training issues
not shown:
- bigger companies deploy more slowly, use slightly fewer measures
- biggest companies (1000+) are more likely to use outside resources & are less happy when they do (suggests problems)
(27) 3- MA drives efficiency (least successful goal)
(28) - [success measures] – not web site traffic, conversion rates, or nbr of leads (beyond MA control e.g. mostly about new names)
- better on email response rates, ‘other’, campaigns per month (more within department control)
(30) [goals vs satisfaction]
- identify prospects and work more efficiently are least satisfied goals
- both reflect unrealistic expectations
- MA is basically about nurture and integration, doesn’t create new leads
- MA creates more work; efficiency requires process change
(32) [staff vs satisfaction]
- 2/3 add staff or consultants; happiest if they don’t
- another myth is that you should use the same people for both
- highest satisfaction from using
- consultants for deployment (=setup),
- new staff for on-going (different specialties);
(33) - MA is ‘over’ (has stopped evolving)
(35) - external / environmental trends
- been lots of IPOs, acquisitions – do slow things down, but also bring new resources
- market dynamics: big companies move upstream, opens space for new/SMB entrants
- also seeing a lot in B2C, which might eventually flow back into B2B (B2C are more flexible)
- see need to integrate customer journey / experience; requires coordination across marketing / sales /service
- big data has some impact, more on analytics (but requires self-service; is outside of MA data)
- extension to prospecting
- ad tech integration is happening and is huge
- social and other data is worth mentioning again: for sales & service enablement as well as marketing; distinguish enhancement of known from net new prospects (remember that’s a weakness of MA)
(38) - internal trends
- social, content, predictive
- translate offers across media
- semantic analysis (manage unstructured data, auto-select campaigns/offers)
- advanced attribution
- mobile & local marketing
(39) What do you do when the myths are gone?
(42) Features that Matter: Basic
3. what features drive success (selection criteria vs satisfaction)
what features should you look for?
- basics: email, landing pages, user tracking, campaign flows, lead scoring, CRM integration, analytics, technology
4. cutting-edge social and content features (not in survey; from VEST)
(45) Features that Matter: Advanced
a. social features
- common: share to social buttons, track social response, create social posts
- frequent: social forms; track influence; build profile from external
- rare: monitor and respond; create promotions e.g. contests, social sign-on
b. content marketing features
- external content discovery
- content library available to campaigns
- support for video, mobile, geo-specific, other new formats
- standard tagging for content selection & analysis
- track response by content item across campaigns
c. predictive modeling
- rarely built in; usually 3rd party, which assembles data and does modeling
- key considerations: types of data imported esp. via standard connectors; degree of automation in initial set-up and in building new models; types of models e.g. only response or also recommendations (best choice among many); self-adjusting over time; explanation of results;
(49) Integration Planning
So, how can you avoid integration problems?
5. how to integrate marketing automation with other corporate systems
a. integration options:
- prebuilt connectors to specific systems
- generic connectors e.g. Actian, MuleSoft, Jitterbit, Boomi, Scribe, SnapLogic, IBM-Cast Iron, Zapier
- open APIs (be specific about data import, data export, system functions, latency, volume limits, handling of custom data, REST vs SOAP)
- platforms: from vendors w/open APIs; push-button (but often limited in depth)
b. integration considerations
- systems supported
- types of data handled (structured, unstructured, semi-structured)
- data volumes, speed, scalability
- batch vs real time; triggers
- features for mapping, transforms, validations, process flows
- tech skills required
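The batch-vs-real-time and volume-limit considerations above can be sketched without picking a vendor. The endpoint name and the 200-record page size below are hypothetical; check the actual API documentation of whatever system you evaluate, since most open APIs cap records per call.

```python
# Sketch of the batch vs. real-time trade-off in integration planning.
BATCH_LIMIT = 200  # hypothetical per-call volume limit from the API docs

def plan_batch_sync(records):
    """Nightly batch: chunk records so each call respects the volume limit."""
    return [records[i:i + BATCH_LIMIT] for i in range(0, len(records), BATCH_LIMIT)]

def plan_realtime_push(event):
    """Trigger-style: one call per event -- lowest latency, highest call volume."""
    return {"endpoint": "/v1/contacts/upsert", "payload": event}  # hypothetical endpoint

records = [{"email": f"user{i}@example.com"} for i in range(450)]
print(len(plan_batch_sync(records)))  # 450 records -> 3 calls of <=200 each
```

Working through this arithmetic against your real record counts and latency needs is exactly the "data volumes, speed, scalability" homework listed above, and it is testable before you sign.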
(47) Recommendations / what to do
- don’t even think past a 3-year horizon
- take your time
- do your homework
- focus on features for short term
- plan ahead
- deploy quickly
- measure and adjust
- long term, plan for change – MA is evolving rapidly
- look for flexibility as well as features
- recognize won’t keep existing system for long
(50) Two lessons
- You can fix a lot of problems, but you can’t unchoose the wrong system.
- Choosing the right system, and final success, depend on preparation.