eWriter HTML to EXE File Review

EC Software GmbH, the Austrian company behind the authoring tool Help+Manual, recently announced a free converter called eWriter. According to its own publicity:

“It allows you to package a complete HTML application (along with all included files like HTML, CSS, JavaScript, image, etc.) into an independent and executable Windows application.”

Originally designed as a solution for all those Compiled HTML (CHM) files that no longer work on Windows machines, it reuses much of the same functionality as CHM files. It also supports Unicode characters, HTML5, and CSS3.


Test Results

Matthew Ellison of UA Europe mentioned recently that he’d tried it out using WebHelp output from MadCap Flare. It worked well for him, so I thought I’d try the same using WebHelp output from Adobe RoboHelp.

There’s a good introductory video on their website should you need it, but no help file. Thankfully the software is easy to use. It is pretty much just a case of specifying the source and output directories, and your desired output format (.EXE or .EBOOK). There are configuration options that control the size of the window and what actions users can perform, and there’s a useful option of saving the configuration to a file should you need to repeat the process.

The Adobe RoboHelp project I used had DHTML elements, embedded multimedia files, and customised JavaScript. It also included the output from 14 other merged WebHelp projects. So it was a pretty good test.

I used the .EXE output option. The generation was surprisingly quick considering the number of files involved. Once the .EXE file was launched, the output was displayed in a browser-type window, and looked exactly like the WebHelp output. All navigational elements worked as expected. Even our heavily customised search tool worked well.


On the face of it, this seems like a useful tool in certain scenarios. However, it does have some drawbacks:

  • Whilst it is possible to run some .EXE files on non-Windows machines, it isn’t something most users want to do. Therefore eWriter isn’t a viable solution if your users have an iOS device.
  • .EXE files themselves are problematic to distribute. Firewalls will almost certainly flag them as suspicious, and may even reject them.
  • To get around the .EXE file problem, an option is available to output just the data to an .EBOOK file. This makes it easier to distribute, but users must have the appropriate reader application on their machines to open the file.


eWriter works well to package up any files in a directory into a single file. That in itself makes it very easy to distribute. It also displays the output in much the same way as the original output format.
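eWriter’s own packaging format is proprietary, but the underlying step, walking a directory and bundling every file into one distributable archive, can be sketched with the Python standard library. This is a conceptual analogue only: it produces a zip rather than an .EXE or .EBOOK.

```python
import zipfile
from pathlib import Path


def package_webhelp(source_dir: str, archive_path: str) -> int:
    """Bundle every file under source_dir into one archive.

    A conceptual sketch of what a packager like eWriter does;
    returns the number of files packaged.
    """
    source = Path(source_dir)
    # Collect the file list before creating the archive, so the
    # archive never tries to include itself.
    files = [p for p in sorted(source.rglob("*")) if p.is_file()]
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as archive:
        for path in files:
            # Store paths relative to the source root, as a help
            # viewer expects (index.htm, css/style.css, ...).
            archive.write(path, path.relative_to(source))
    return len(files)
```

Point it at a WebHelp output folder and you get a single, easily distributed file, which is essentially the convenience eWriter provides (minus the built-in viewer).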

However, the limitations make this a “nice to know” solution. For most of us it could prove useful at some point in the future, but isn’t right now. It’s one to file away for when it is.

Technical Communication UK Conference 2018

The Technical Communication UK (TCUK) Conference took place in Daventry, UK last week. Run by the ISTC, it is the biggest conference in the UK for anyone involved in technical communication.

As someone who’s attended (and in the past helped organise) these conferences, what strikes me most is the changing role of Technical Communicators. It is almost as if the profession is trying to find where it fits best. Are we writers, illustrators, e-learning producers, or just editors? That is perfectly demonstrated by the array of subject matter in the conference’s agenda.

The Conference Agenda

The agenda saw presentations on how our profession can improve marketing and the user experience (UX). It also covered more technical topics like using GitHub and designing a chatbot. There were also perennial favourite topics like DITA and video. All in all, it had something to appeal to most of us.

So why didn’t I attend? After all, I had the budget for our team to attend.

Part of the reason is our workload. Our team has two major deliverables due the week after the conference. That said, with some careful planning we could have shoehorned in a couple of days away in Daventry. It would have been pretty full on, but we’d have coped.

No. My major reason for not going was the potential information on offer. As an industry, we seem stuck in a rut, unable to answer the question of identity I posed at the start of this post. This results in a conference agenda that covers a lot of subjects, but is of little practical use to my team.

There is the argument that covering topics that are irrelevant now gives you knowledge that may prove useful later. That’s certainly true, but only if those topics are likely to be used in the near future. If they’re not, it’s likely the information will be out of date by the time you need it.

Hashtags and all that stuff

Another problem I had with the conference this year was the lack of good social media coverage. In the past there was reasonably good use of Twitter and subsequent blog posts. This year there was near radio silence. Even the tweets that did appear on the #tcuk18 hashtag didn’t offer a lot, as I pointed out in an effort to change things.

There were one or two people tweeting, but most of the tweets were short snapshots of words and phrases with little or no background information. We were left in the dark as to which presentation, or even subject, they related to. The result was more a summary for those who attended the conference, but of no use at all if you weren’t there. A basic technical communication error!

Maybe it was the poor wi-fi that some reported on day one of the conference. If so, that should have been sorted. If it wasn’t, I hope the ISTC doesn’t return to the same venue until it is. Having a good internet connection at a conference is high on the list of “must haves” in my opinion.

So what next?

Personally, I doubt I’ll be attending a TCUK conference anytime soon. It has always attracted a high proportion of self-employed writers, and it’s a great place to network with peers and potential employers. It also attracts a number of professionals in full-time roles, often working as a solitary Technical Communicator, who crave meeting like-minded folk.

That’s all cool, but for me it just doesn’t fit well with what I want. At the moment I’m looking into how my team will cope with:

  • An impending Salesforce integration, and whether we’ll use it or just deliver to it.
  • If we just deliver to Salesforce, what changes in technology are required.
  • Changes in the Engineering Department that affect how my team works.

Another priority for me is developing the Technical Communicator team. They are fairly young and keen to learn, and have done a great job to date, but I want them to see what else the industry is doing. They would almost certainly have got more out of the TCUK Conference than me, but most of it would still have been of little use to them going forward.

The long and the short of it is, if we’re going to invest £1,000 for a delegate to attend, it has to deliver more than just “nice to know” information.

My Adobe FrameMaker journey

As a former Adobe Community Professional (ACP), I used to post tips and tricks on various Adobe technical communication products, mostly Adobe RoboHelp and Adobe RoboHelp Server, as I’d used them for over ten years. I’d also participated in beta releases, helping to ensure problems were discovered (and hopefully fixed) before release.

These days I don’t use any Adobe technical communication product, and had to forfeit my ACP status. Three years ago I moved jobs to take on a management role, where the team uses a community platform to author and host our online documentation. It’s not a perfect solution for my team or our users, but that is about to change. I’ll have more news on this early in 2019.

When I was an ACP, my job didn’t have a great need for Adobe FrameMaker, although we had a licence as part of the Adobe Technical Communication Suite. So when an opportunity came along to learn, I grabbed it with both hands. It involved a project compiling a large process and procedures document.

The initial brief was to author in Microsoft Word until I got involved, but only because they wanted PDF output and the powers that be didn’t know any better. I suggested that Adobe FrameMaker was a better fit, and once I’d explained the benefits this was accepted. I’d dabbled with Adobe FrameMaker in the past, but this gave me the opportunity to learn and use it properly.

Adobe FrameMaker was once described to me as the “Rolls Royce of authoring tools.” I’ve also heard it called lots of other things, some of which are not repeatable here! I won’t say it’s perfect (what application is?), but a lot of the problems faced by those disgruntled users were caused by poor training or unrealistic expectations.

Part of the reason for this is their previous experience of applications like Microsoft Word. Microsoft were clever to design a product that was easy to use right out of the box. Perhaps too easy. It established a user base among folk who’d never even thought that something would replace the typewriter, let alone used a computer. All of a sudden everyone was creating documents… badly!

That was OK so long as all they were doing was writing a letter or making notes. Even to this day, PhD students will scream and shout about how poor Microsoft Word is for writing their 300-page thesis. Whilst there are those who swear that Word templates can cope with files that size, it’s not straightforward for your average user.

Adobe FrameMaker does have a steep learning curve. I recommend that new users attend a course or (like me) buy a good book. It’s well worth it to avoid having to reinvent your templates, and provides many a time-saving tip. The Adobe FrameMaker Forums are also well worth visiting, with excellent support from real users.

Whilst I haven’t used Adobe FrameMaker in anger for a while, I’ve kept a watching eye on its iterations. There’s another release imminent, Adobe FrameMaker 2019, and there’s a webinar planned to showcase what’s included in it. See the link below to register:


Full details of what will be shown in the webinar are included in the Adobe TechComm blog post at:


In summary, the release includes:

  • A major platform update, including 64 bit architecture.
  • A new PDF generation engine that negates the need for PostScript or Adobe Distiller processing.
  • UI changes including:
    • A new better organised welcome screen.
    • The return of colour icons, with a choice to revert to monochrome.
    • Changes to make finding a colour or style easier.
  • Additional language support for German (Duden).
  • Improved image handling, including transparency.
  • Improved DITA and XML workflows.
  • Support for Microsoft SharePoint 2016 or SharePoint Online.
  • Support for Adobe Experience Manager 6.4.

Todoist task formatting tips

I’m a seasoned user of Todoist, an online to-do list application that works across all devices and browsers. I’d hesitate to call myself a power user, but I do use it extensively in both my professional and personal life.

One of the reasons I love working in the IT sphere is how applications you’ve used for a while occasionally surprise you with what they can do. Todoist did that to me today when I watched one of Carl Pullein’s excellent productivity YouTube videos. I’ve embedded it below for completeness.

In it Carl formats tasks so that they:

  • Don’t need a date / time scheduled.
  • Are formatted in bold. (Note: I’ve also discovered how to format in italics or both bold & italics).

It’s All About Those Asterisks

Task formatting is as easy as adding one or more asterisks. Check it out in the short video below.
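As a minimal sketch of the asterisk tricks, the helper below builds task strings using the markdown-style markers Todoist supports (**bold**, *italic*, ***both***, and a leading "* " for a task with no checkbox, which therefore needs no date or time). Treat the exact markers as per the video rather than gospel:

```python
def todoist_task(text: str, bold: bool = False, italic: bool = False,
                 uncheckable: bool = False) -> str:
    """Build a Todoist task string using markdown-style markers.

    Assumes Todoist's markers: **bold**, *italic*, ***bold italic***,
    and a leading "* " to create a checkbox-free task (handy as a
    header that never needs scheduling).
    """
    if bold and italic:
        text = f"***{text}***"
    elif bold:
        text = f"**{text}**"
    elif italic:
        text = f"*{text}*"
    # The leading "* " marker goes outside any formatting markers.
    return f"* {text}" if uncheckable else text
```

So typing the string `todoist_task("Weekly Review", bold=True)` produces into the task field is all it takes; no menus or toolbars involved.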

The costs of poor communication, and how to tackle them

There’s an old joke about doctors’ handwriting being illegible. These days that’s less of an issue, as patient notes and prescriptions are typed, but it has highlighted a different issue.

The BBC reported today on an initiative to get doctors to communicate with their patients in plain English (https://www.bbc.co.uk/news/health-45394620). The problem is that patients referred to hospitals are receiving appointment letters full of medical jargon they don’t understand. It’s manifesting itself in patients making appointments with their GPs just to ask them to explain what a form of treatment they’re expecting actually means.

Take the following excerpt from a hospital discharge report I recently saw for someone I know:

"CTPA showed bilateral segmental and subsegmental PEs. Initial Troponin raised (46) repeat 11."

This was supposed to inform the patient and their GP what had happened to the patient whilst in hospital, and the prognosis delivered.

The problem here is the two audiences addressed by the same deliverable. The patient’s GP will understand it, but the patient likely won’t.

In instances like this, it is often easier to resort to jargon. It’s the doctor’s own language, after all, just as DHCP or MAC addresses are for two Network Engineers talking shop. That’s fine so long as the audience is the same as them. Try involving an outsider though, and you’re asking for trouble.

You need two separate deliverables, based on the same content. That’s something most Technical Communicators understand and deal with on a daily basis, particularly in a software environment. Whether it is end users or administrators, English or Spanish speakers, you need to have the content for each audience generated from the same source.
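That single-sourcing idea can be sketched in a few lines of Python: each content unit is tagged with the audiences it applies to, and one render pass per audience produces a tailored deliverable from the same source. The audience tags and the plain-English wording below are illustrative only, not a clinical translation:

```python
# Shared source content: each unit carries its intended audiences.
SOURCE = [
    {"text": "CTPA showed bilateral segmental and subsegmental PEs.",
     "audiences": {"clinician"}},
    {"text": "A CT scan found blood clots in both lungs.",
     "audiences": {"patient"}},
    {"text": "A follow-up appointment has been booked.",
     "audiences": {"clinician", "patient"}},
]


def render(source, audience):
    """Return the deliverable for one audience from the shared source."""
    return "\n".join(unit["text"] for unit in source
                     if audience in unit["audiences"])
```

Rendering for "patient" and "clinician" then yields two different letters from one maintained source, which is exactly what conditional text in most authoring tools does at scale.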

Mark Baker recently asked on Twitter why Technical Communicators find it so hard to explain our profession’s importance. It solicited a fair few responses, yet none really answered the question.

It’s an interesting question. We’re good at explaining things within our own specific spheres. We can even turn our hands to different spheres, but ask us to explain why we’re so important to others and we seem to struggle.

Case studies like the UK doctors help us, in that a direct effect of poor communication has resulted in wasted GP appointments and frustrated patients and doctors. By calculating the time and money spent on these appointments, we can monetise the problem. Armed with that information, we can argue how our working to resolve the issue can save the organisation money.

Maybe there’s a lesson for us there.

Just a “normal” day in my life as a Technical Writer

The only usual thing about my days as a Technical Writer is that they’re rarely usual. Today was no exception.

I manage the technical writing function, including another member of staff. Unfortunately for me, they started a week’s holiday today and I’ve just returned from two weeks away. The timing of our holidays isn’t ideal, but I’d monitored what was going on during my absence. I didn’t actually do much, but it ensured there weren’t any unpleasant surprises on my return.

My journey to work involves a 15 minute walk to the station, a train into central London, and a further 20-25 minute walk. I could get a bus or tube, but I figure that by the time I walked to the bus / tube, waited for said bus / tube to arrive, decide not to force myself onto an already crammed bus / tube, get on the next bus / tube, and walk to the office, I may as well just walk. Besides I enjoy walking.

Today I set off wearing a fleece, and soon regretted doing so. After the walk to the local station, it came off and never came close to being worn again. It was announced today that the UK had just experienced its joint hottest summer on record. The last week or two had seen a slight drop in temperatures and plenty of rain, but today demonstrated that whilst we are now in the meteorological autumn, it’s still warm enough for a short sleeved shirt commute.

Talking of surprises, whilst I was away a desk move was announced. I knew this was happening, but was unsure when. It turned out it would take place on the Friday before my return, so I’d asked my team member to ensure everything necessary was moved. This happened, but of course on arriving this morning I spent an hour ensuring everything was set up to my desired configuration.

A colleague last week had sent me a meeting request for 11am to discuss a forthcoming release. We’d completed all the updates some months ago, but the release was delayed. With the release imminent, she wanted to discuss some finer detail of what we’d provided. I went in search of her, not easy as she’d moved desks also, and managed to get the meeting moved to the afternoon. Enough time to organise myself and start trawling through my emails.

Within 30 minutes, I’d been approached by three separate Product Managers to inform me of projects in the pipeline. Only one of them is in any way urgent, likely to be released later this month. It doesn’t contain too much additional functionality, but annoyingly does contain new icons. I made a note to inform our Education Team who produce our certified video training programmes. They just love having to update all their videos every time an icon changes! We’ve got another large project completing about the same time that we’d both been working on. I may have to look at reallocating resources to ensure both project deadlines are met, once I’ve evaluated the effort involved. That’s not for today, so an item is added to my to do list.

The time up to lunch was spent on admin. As a manager, admin is an essential part of what I do. It may sound boring, but having processes and ensuring they’re followed is pivotal to a smooth-running team. Running reports, trawling through my Inbox to prioritise work items, or updating our project spreadsheets with details of changes and timescales: these tasks have to be done.

Around 12:30 I down tools for a trip to the gym in the basement of our office. I’m a keen runner, but have been recovering from a serious injury since late March. Today was my first day back on the treadmill, so I planned a gentle 30-minute jog. I want to get back to doing my local Parkrun ASAP. Being a Parkrun volunteer is fun, but just doesn’t give you the same buzz. I’m not looking to get back to anything like my PB just yet, but I’ve a goal in mind.

Back in the office, and I’ve time to eat a sandwich and an apple before my rescheduled meeting. The Product Manager and I discussed some minor changes to our best practice guides. A perfect chance to do some actual technical writing. As we discussed each setting, I changed the pages on the fly.

Afterwards I finished going through my Inbox. That only left the automated emails, comments generated by users, and those sent to our team’s distribution list, each of which is sent to a separate folder. The comments were easy, as not all are related to our Knowledge Base.

By mid afternoon, my earlier plea to colleagues to come and eat a biscuit or two I’d brought back from my holiday seemed to be having some effect. The biscuit mountain was reducing, but not quickly enough to stop me taking the easy route to solving my afternoon munchies.

Late in the day and another work request came my way. Four new projects in a day. That’s a record! Once again I update our project spreadsheet with the details. It’s a useful shared resource that the team and my boss can use to see what is coming up.

As folk began to leave for home, I pondered whether to stay late and finish off those unread automated emails. I considered logging on from home this evening rather than staying in the office. As a global company, and with my manager not based in the UK, I’m used to occasionally working irregular hours. However, as today was a US holiday, I decided I’d have some time tomorrow morning to sort those email folders out before the USA woke up.

So it was a walk to the station, sans fleece. An unusually busy train meant a less comfortable journey than normal. My wife texted me to say she’d bought some milk and bananas, so I didn’t need to visit the supermarket on my way home. Bless her, as I hate that supermarket with a passion. It’s very handy for commuters, being only a minute from the station, but its layout and poor customer service make a visit a soul-destroying experience.

Home by 7:30, I change into shorts and sandals, take the bins out for the bin men who arrive early tomorrow, and sit down to eat. Nothing fancy tonight, but that’s OK. Plain food can still be tasty.

After catching up with my wife on her day at work, I spend time catching up on what’s been going on online. I normally try to do this during my train journeys, but today I didn’t. I predominately use Twitter for professional information, and Facebook for personal stuff. After a few minutes trawling, I had the idea for this post, but before I started, I prepared my lunch for tomorrow.

As my bed beckoned, my mind turned to what I can expect tomorrow. I know what I’ve left from today, but you can bet there’ll be the odd curve ball thrown in to make life interesting. Working for a software company in a dynamic environment is never boring.

Technical Communicators: There’s hope for us yet!

“I found that exercise rather depressing”, I said having participated in an exercise at a recent training session. Unsurprisingly my slightly tongue-in-cheek comment solicited a question from the trainer. “Why’s that?” To answer that, I need to explain the exercise.

We were given a scenario. We’re in a large city with a transport problem: there isn’t enough transport for those who want to travel. The answer is possibly hot air balloons! As the Head of Transportation, we had one hour to research whether they really were the answer to all our problems. In order to do this, we had four options:

  1. Read the blueprints and instruction booklet?
  2. Watch other hot air balloonists and devise a plan?
  3. Meet with a subject matter expert and ask them questions?
  4. Just buy a balloon and try it out ourselves?

In our group only I went for option 1. Five went for option 2, with one each going for options 3 and 4.

Now do you see why I said what I said? As a Technical Communicator, I design how best to present the blueprint, and I write the instruction booklet. If no one but me would choose to even look at them if they’re in a hurry, what is the point in my profession?

OK, so we’ve managed to buy another hour’s research time. Which of the three remaining research methods would you choose? Six went for option 3, with one each for options 1 and 2.

With eight people, it’s hardly a scientific sample, but it did raise some interesting insights into the different learning styles people have. According to Peter Honey and Alan Mumford, these are:

  • Option 1 = Theorist
  • Option 2 = Reflector
  • Option 3 = Pragmatist
  • Option 4 = Activist

As a follow-up exercise, our group completed a questionnaire that aimed to demonstrate which of the four learning styles we best fitted. Guess what? The person who’d said they’d immediately buy a balloon and learn from their mistakes found they actually had a high theorist score.

Ha! You can deny it as much as you like, but well-designed and well-written technical documentation will always be needed. Especially by those who say they don’t read it.

Mimecast & Ataata: A TechComm match made in heaven

Today Mimecast announced it had bought Ataata, a cyber security training company. It’s a common sense acquisition for one of the leading cyber security companies, but from a technical communication perspective, it’s so much more.

Ataata provides short videos aimed at educating users about all aspects of cyber security. There are lots of training companies who provide educational videos, but Ataata does so in a way that’s engaging and fun. Users love them. In fact, they look forward to receiving the next one! Take a look at one and see for yourself.

Mimecast has been looking at how it can educate its users. We recognize that having the means to prevent threats from entering an organisation is only part of the solution. If you personally don’t fully engage in identifying where threats exist, you’re asking for trouble. In short, the weakest link is you.

I’ve been involved in projects at Mimecast looking at educating users about cyber security. We’ve embedded copy into the user interface to warn them about phishing attacks, and written white papers on steps companies can take to protect their data. It’s not just about using our cloud solution. This has been a focus of our CEO, Peter Bauer who’s been quoted as saying, “Our customers desperately need help training their human firewalls.”

It’s as a technical communicator that I’m interested in the Ataata acquisition. Our job is helping our users use our software, but achieving this is so much more than providing help. Mimecast recognizes the need to provide assistance where it’s needed most. Yes we provide online help, but we also provide embedded user assistance in the user interface.

This takes the form of text and video tutorials, but we’re also redesigning our user interfaces from the ground up. Gone are the dialogs with 20-30 fields and options, and in come wizard type dialogs with user friendly questions. By answering a few questions, we can identify the configuration a user needs, and set the options for them behind the scenes.

All this is designed to deflect support calls. Support calls cost money. End users spend time looking for content, and if they can’t find or understand it, they contact our support staff. Whether it’s the cost to our users trying to complete a task, or to our support staff dealing with queries, time is money. If we can prevent our users ringing us, it’s a win-win.

This acquisition may not directly educate users in how to use our software, but the preemptive nature of Ataata’s solution means users should have fewer issues. The most expensive support call category involves data loss. Phishing, whaling, and impersonation attacks can take time (a lot of it) to recover from. Our software has solutions to prevent these attacks, but we want to provide another layer of protection.

In the ideal world, our cyber security solutions would never be needed. If we were all vigilant 100% of the time, companies like ours wouldn’t exist. The fact that they do shows how delusional that view is. Having a multi-faceted approach to protecting your data is the way ahead, and from a technical communication perspective there’s plenty of scope to integrate our content in a fun and engaging way.

Choosing the “right” CMS

I’ve spent a couple of hours looking through the wish-list items added by users of a well-known CMS. What stands out is how users want more from their CMS. Where in days gone by they were happy with a simple editor that allowed them to capture content, now they want (to name just a few trends):

  • A richer editing experience (e.g. indented list items, image / video, and table formatting).
  • The ability to import / export from / to more file formats.
  • Reusable content (e.g. variables and text snippets).
  • Smoother workflows for editing, reviewing, and publishing content.

What strikes me about this list is that all of these could be achieved with one of the specialist technical writing applications. This begs the question: why aren’t more content curators using them?

There’s no simple answer to this. It depends on the requirements and culture of the organisation. Maybe it’s a need to keep all the content in one place, or maybe it is ignorance of what a specialist technical writing application offers.

Whatever the reasons are for not using one tool or another, there is one question everyone looking at CMS providers needs to ask. And it isn’t, “Can tool A do x, y, and z?”

The classic mistake many make is to focus on the technology before considering the requirements. To use a slightly crude analogy, there’s little point in buying a family-friendly Hyundai saloon car and then wondering if it’s the right car for a trip across the Sahara desert. Instead, you should first consider what you’d need by way of four-wheel drive, storage for water and fuel, and the ability to pull yourself out of a sand dune. Once you’ve done that, then (and only then) can you look at the vehicles best suited to your needs.

There are a variety of methods to ascertain if a CMS meets your needs. For example:

  • Search the product’s user forums.
  • Look for online user groups, particularly those not controlled by the vendor.
  • Look for wish list items to see what users are wanting that isn’t currently delivered.
  • Attend user group meetings or conferences.
  • Ask your peers.

If after all this you find yourself in a situation where you must host the content in a particular CMS, don’t be fooled into thinking you must author it there too. Look into the CMS’s import / export functionality. Even if this isn’t there out of the box, perhaps there’s an API that can help. Admittedly this normally requires additional resource from elsewhere in your organisation, but if the major stakeholders have the organization’s interest at heart, that shouldn’t be a problem.
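As a sketch of that author-outside, import-via-API approach: the function below turns locally authored HTML files into upload-ready payloads. The payload shape (slug/body) and any /api/pages endpoint you’d POST them to are hypothetical, since each CMS defines its own API:

```python
from pathlib import Path


def build_payloads(docs_dir: str) -> list:
    """Turn locally authored HTML files into upload-ready payloads.

    The slug/body shape is hypothetical; a real CMS's API docs
    define the actual fields and endpoint.
    """
    payloads = []
    for path in sorted(Path(docs_dir).glob("*.html")):
        payloads.append({
            "slug": path.stem,                         # page identifier
            "body": path.read_text(encoding="utf-8"),  # authored content
        })
    return payloads
```

Uploading would then be one HTTP call per payload (for example with urllib.request), meaning the team keeps authoring in its preferred tool while the CMS remains the hosting layer.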

Choosing a solution for your content curation needs isn’t easy, but a quick, easy decision could prove disastrous and lead to a repeat of the exercise further down the line.

Adobe’s Technical Writing Trends 2017 Survey Results

Each year Adobe run a survey for Technical Writers. It asks questions about the tools, methodologies, and strategies used by those in the technical documentation industry, and tries to predict what will change in the future. They have a vested interest in the survey’s results, as they produce a number of applications used by Technical Writers. That said, the results aren’t focused on their products, instead aiming to provide an honest picture of the industry. Having listened to the survey results in their recent webinar, here are my key takeaways.

Structured Authoring

It isn’t a surprise to see companies increasingly using structured authoring. It has many benefits, particularly for those with large content silos, multi-faceted output requirements, or translation needs. As such, the main requirements for moving to a structured authoring environment were the ability to reuse content, apply consistency, and make updates easy.

What is a surprise is the growth rate. According to this and past surveys, the proportion of companies that have either adopted or were thinking of adopting structured authoring has grown from 20% in 2012 to 50% in 2017. Predictably it is companies with a 1,000+ workforce that use structured authoring most, with over 50% of them adopting the methodology.

Benefits of structured authoring

DITA XML remains the stand-out standard for delivering structured content, with nearly 75% of structured authoring respondents either using it or likely to use it. Custom XML solutions come second at 47%. Again no real surprise, apart from the percentage focused on DITA XML. It has been around since 2001, but has been widely developed and adopted since IBM handed over its management to OASIS.

Output Types

PDFs are dead. Long live PDFs! Folk have been predicting the demise of PDFs as a way to deliver content for years, but they’re still widely used. The survey shows a 90% dominance over all other output types. Responsive HTML5 comes a distant second at around 50%, although this is up significantly on recent years. It looks like PDFs will be around for a while yet. Not surprising really, when you consider their advantages and the lack of anything that even remotely matches their functionality.


One interesting side discussion around PDFs was their lack of responsiveness. As we deliver content on a variety of devices, the need to make content usable on each and every device without having to create and maintain separate source files, is of paramount importance. The rise in usage of responsive HTML5 is testimony to this. Adobe more or less dodged the discussion on whether PDFs would be made more responsive. To be fair to Adobe, there wasn’t a great rallying cry from us users to make them responsive.

Personally I think most of us don’t even try to use PDFs where there’s a responsive requirement. We prefer to use a different output type that best fits our requirements. If we want responsive output that works on a tablet or mobile, are PDFs really the best output format? They can take up considerable storage space, and aren’t as user friendly as other delivery methods.

Other Usages

The latter part of the webinar focused on a couple of areas that raised an eyebrow:


There are a small number of survey respondents using chatbots to deliver technical content to users. It is fair to say that such delivery methods are still in their infancy, with many differing styles in use. It is also unclear how the tools we use to deliver content fit with this delivery methodology. If this delivery model becomes more of a requirement, perhaps our tools will have to change accordingly.

I’m less sure of the need for our tools to provide chatbot functionality. There are already a number of applications that provide this. What is needed is a way to leverage our technical content inside these applications.

Alignment of Marketing and Technical Content

With the rise of structured content usage, I can see why the synergy between marketing and technical content has increased. If you can reuse content for multiple needs from a single set of source files, that’s a big vote winner. However, it requires a real sea change in a company. Most companies I’ve worked for have an ongoing battle between the Marketing and Technical Communication departments. We both see the need for content reuse, style, and consistency, but have very different ways of achieving it. Part of the reason for this has been the tools we use to create the content. However, if you move to structured content, the tools become less of an issue.

Adobe have a vested interest in getting more folk in an organisation to use their software. The trouble is their user base has historically been the Technical Writer community. With the growth of content marketing, it is natural that they see this as an opportunity to expand their user base. So far, I’d say this has only had limited success, but this is a long term strategy.


The results of the 2017 survey may not have provided any real surprises, but there was enough of interest to make me sit up and take notice. The few thousand respondents provided a representative sample of our community, covering a large number of job functions and levels of seniority.