A useful message?

I had a big day ahead of me. I’d done all my preparation: I’d rehearsed the presentation and prepared the room the night before. On the early train to London, I’d looked over my notes one last time and visualised how things would pan out. What could go wrong?

Arriving in the conference room, I checked the room was as I’d left it the night before. It was. So I got myself a strong coffee and turned on the projector, only for the following to display on the screen.

“The bulb is coming towards the end of its useful life.”

As user assistance goes, it tells you exactly what you need to know, but it’s not what you need to see at 7:45am when you’re expecting a room full of people in a little under an hour!

Recruiting: the upside of management

Being a manager can be tough. Not only do you have to make difficult decisions that affect both you and your team, you have to deal with others doing the same for theirs. But occasionally you’re faced with a difficult decision that’s a joy to make. I was in this situation yesterday, following the move of one of my staff to another team.

I’ve been conducting interviews for her replacement since the New Year. The quality of the CVs that have passed my way surprised me, which made it more difficult to narrow the field down. A nice problem to have though.

Eventually we interviewed six, narrowing this down to three who came back for a second interview and technical test. Of those initial six, only one was a definite “No”, and only because they didn’t fit our office culture. They were actually the strongest candidate technically.

Of the three who returned for the second stage, there were two stand-out candidates. Both were fairly similar in experience, but had very different personalities and backgrounds. They both did well in the technical test, although perhaps one just shaded it. So who should I disappoint?

As I said, management can be difficult at times, but this is the sort of decision every manager craves. Either of the candidates would have fitted in well and done the job. In the end it came down to who best fitted my needs. One candidate would have needed more support in the early stages. The other seemed more capable of working independently. As I’m not a micro-manager, that candidate got the nod.

Having analysed the facts, making the final decision was relatively easy. Coming from a strong position helped, but so did having such good candidates. I’m truly sorry that we didn’t have two vacancies. I’d have had no qualms about hiring them both. But I’ve no doubt the unsuccessful candidate will get another job soon.

Todoist: Focus on tasks based on your location

My Todoist workflow

It’s the start of a new year, and I’ve taken a few days’ leave. So what better time to evaluate my Todoist workflows? Todoist is my go-to productivity tool. It keeps me productive by reminding me of exactly what I should be doing at a particular point in the day.

However, one of the issues I had with the way I’d set it up was my need to perform tasks relating to both work and home life. The real problem is that life is never as neat and tidy as you may want. For example, if I need to arrange for a plumber to come and fix a dripping tap at home, I have to do this during office hours.

So how do I focus on work or home tasks in Todoist without making it a cluttered mess?

The key for me is the use of labels and filters. All my tasks use a label called “home” or “work”, and have a due date. Using these, I’ve set up filters that focus on tasks due today, one for each label. This allows me to focus on work tasks when at work, and home tasks when at home. However, I can also use the default “Today” view if I need to view all tasks side by side. Handy for calling that plumber.
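For anyone wanting to set up something similar, the filter queries themselves are short. Something like the following works (the label names here are my own examples; Todoist’s filter syntax combines dates and @labels with & and |):

```text
Work today:  (today | overdue) & @work
Home today:  (today | overdue) & @home
```

Saving each query as a named filter puts it one tap away on any device.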

Writing useful release notes

Writing about technical subject matter isn’t as easy as you may think. Writing to users about changes in a technical product, with the aim of informing them about exactly what has changed and how it affects them, is downright difficult.

Take the following example:

“Synchronisation has been improved.”

This only goes part way to telling users what has changed in the product, but leaves so many questions unanswered. Like:

  • Exactly what has been improved? Is there a better response time? If so, what is it now?
  • Has a bug been fixed? If so, what is it, and what has been changed to fix it?
  • Has new functionality been added? If so, where is it located, and where is more information on it?

The example above was from a set of release notes I recently came across, and clearly wasn’t written or proofread by a Technical Writer. If it was, they should be ashamed of themselves.

Better examples would be something like:

Synchronisation performance has been improved by up to 20%, after changing the FILENAME.JAR file to make fewer API calls.

File names containing a ? character no longer cause synchronisation to fail. See support ticket 123456 for further details.

Support has been added to synchronize with Product X. See the “Synchronising with Product X” page in the online help file for further details.

Armed with the above information, users can:

  • Decide whether to upgrade to the latest version, if it’s not automatically deployed.
  • Identify the risk to their organisation of deploying the change.
  • Develop a test plan before deploying across their organisation.

What is content?

It’s a question I frequently find myself asking, and asking others, at the start of a project.

Take one recent example, when I was asked to provide input to a kick-off meeting about content inside our applications. With little by way of detail before the meeting, it was my first question. Getting an answer involved lots of supplementary questions, like:

  • Who was the audience?
  • What level of detail did they require?
  • When did they need it?
  • Why did they need it?
  • Where did we need to place it?

Armed with the answers, I could start to give meaningful suggestions on the content strategy moving forward. Without them, I’d be suggesting solutions without fully understanding why they were necessary. The end result would almost certainly be (at best) a solution that only partially met the objective.

This project highlighted the need for two distinct content types:

  • A more marketing / educational deliverable designed to make users aware of something (e.g. a new feature the first time they go into that area of the UI).
  • Specific problem solving content (e.g. how do I ensure the machine separates the blue widgets from the red widgets).

The project also means auditing the existing content to see what is already there, and to highlight any gaps that need filling. It also enables us as a team to identify content we weren’t previously aware of. Marketing may have material to address the first use case. Our Technical Trainers may have handouts and video tutorials too. Our online knowledge base should address the second use case, albeit in a generic fashion. If there are customer-specific questions, we may need to look into how best to meet that need.

Wouldn’t it be powerful if, once we’ve identified what’s out there, we all took it upon ourselves to use it rather than reinventing the wheel? It would encourage collaboration, and prevent that scourge of content providers everywhere: content silos.

So the easy part is over. The talking has finished and the content strategy is carved in stone. Now all we need to do is deliver it!

eWriter HTML to EXE File Review

EC Software GmbH, the Austrian company behind the authoring tool Help+Manual, recently announced a free converter called eWriter. According to its own publicity:

“It allows you to package a complete HTML application (along with all included files like HTML, CSS, JavaScript, image, etc.) into an independent and executable Windows application.”

Originally designed as a solution for all those Compiled HTML (CHM) files that no longer work on Windows machines, it offers much of the same functionality as CHM files. It also supports Unicode characters, HTML5 and CSS3.


Test Results

Matthew Ellison of UA Europe mentioned recently that he’d tried it out using WebHelp output from Madcap Flare. It worked well for him, so I thought I’d try the same using WebHelp output from Adobe RoboHelp.

There’s a good introductory video on their website should you need it, but no help file. Thankfully the software is easy to use: it’s pretty much just a matter of specifying the source and output directories, and your desired output format (.EXE or .EBOOK). There are configuration options that control the size of the window and what actions users can perform, and there’s a useful option to save the configuration to a file should you need to repeat the process.

The Adobe RoboHelp project I used had DHTML elements, embedded multimedia files, as well as customised Javascript. It also had the output from 14 other merged WebHelp projects. So it was a pretty good test.

I used the .EXE output option. The generation was surprisingly quick considering the number of files involved. Once the .EXE file was launched, the output was displayed in a browser-type window, and looked exactly as the WebHelp output would. All navigational elements worked as expected. Even our heavily customised search tool worked well.

Limitations

On the face of it, this seems like a useful tool in certain scenarios. However it does have some drawbacks:

  • Whilst it is possible to run some .EXE files on non-Windows machines, it isn’t something most users want to do. eWriter therefore isn’t a viable solution if your users are on iOS or other non-Windows devices.
  • .EXE files are themselves problematic to distribute. Firewalls and mail filters will almost certainly flag them as suspicious, and may even block them.
  • To get around the .EXE problem, an option is available to output just the data to an .EBOOK file. This makes distribution easier, but users must have the appropriate reader application on their machines to open the file.

Conclusions

eWriter works well to package up any files in a directory into a single file. That in itself makes it very easy to distribute. It also displays the output in much the same way as the original output format.

However, the limitations make this a “nice to know” solution. For most of us, it could prove useful at some point in the future, but isn’t right now. It’s one to place in your memory banks for when it is.

Technical Communication UK Conference 2018

The Technical Communication UK (TCUK) Conference took place in Daventry, UK last week. Run by the ISTC, it is the biggest conference in the UK for anyone involved in technical communication.

As someone who has attended (and in the past helped organise) these conferences, what strikes me is the changing role of Technical Communicators. It is almost as if the profession is trying to find where it fits best. Are we writers, illustrators, e-learning producers, or just editors? That was perfectly demonstrated by the array of subject matter on display in the conference’s agenda.

The Conference Agenda

The agenda saw presentations on how our profession can improve marketing and the user experience (UX). It also covered more technical topics like using GitHub and designing a chatbot. There were also perennial favourite topics like DITA and videos. All in all, it had something to appeal to most of us.

So why didn’t I attend? After all, I had the budget for our team to attend.

Part of the reason is our workload. Our team has two major deliverables due the week after the conference. That said, with some careful planning we could have shoehorned in a couple of days away in Daventry. It would have been pretty full on, but we’d have coped.

No. My major reason for not going was the potential information on offer. As an industry, we seem stuck in a rut, unable to answer the question of identity I posed at the start of this post. This results in a conference agenda that covers a lot of subjects, but is of little practical use to my team.

There is the argument that covering topics that are irrelevant now gives you knowledge that may prove useful later. That’s certainly true, but only if those topics are likely to be used in the near future. If they’re not, it’s likely that the information will be out of date by the time you need it.

Hashtags and all that stuff

Another problem I had with the conference this year was the lack of good social media coverage. In the past there was reasonably good use of Twitter and subsequent blog posts. This year there seemed to be near radio silence. Even the tweets that did appear on the #tcuk18 hashtag didn’t offer a lot, as I pointed out in an effort to change things.

There were one or two people tweeting, but most of the tweets were short snapshots of words and phrases with little or no background information. We were left in the dark as to which presentation, or even which subject, they related to. The result was more a summary for those who attended the conference, but of no use at all if you weren’t there. A basic technical communication error!

Maybe it was the poor wi-fi that some reported on day one of the conference. If so, that should have been sorted. If it wasn’t, I hope the ISTC doesn’t return to the same venue until it is. A good internet connection at a conference is high on the list of “must haves” in my opinion.

So what next?

Personally, I doubt I’ll be attending a TCUK conference anytime soon. It has always attracted a high proportion of self-employed writers, and it’s a great place to network with peers and potential employers. It also attracts a number of professionals in full-time roles, often working as a solitary Technical Communicator, who crave meeting like-minded folk.

That’s all cool, but for me it just doesn’t fit well with what I want. At the moment I’m looking into how my team will cope with:

  • An impending Salesforce integration, and whether we’ll use it or just deliver to it.
  • If we just deliver to Salesforce, what changes in technology are required.
  • Changes in the Engineering Department that affect how my team works.

Another priority for me is developing the Technical Communicator team. They are fairly young. They’re keen to learn, and have done a great job to date, but I want them to see what else the industry is doing. They would almost certainly have got more out of the TCUK Conference than me, yet much of it would still have been of little practical use to them going forward.

The long and the short of it is, if we’re going to invest £1,000 for a delegate to attend, it has to deliver more than just nice-to-know information.

My Adobe FrameMaker journey

As a former Adobe Community Professional (ACP), I used to post tips and tricks on various Adobe technical communication products: mostly Adobe RoboHelp and Adobe RoboHelp Server, as I’d used them for over ten years. I’d also participated in beta releases, ensuring problems were discovered (and hopefully fixed) before release.

These days I don’t use any Adobe technical communication products, and had to forfeit my ACP status. Three years ago I moved jobs to take on a management role, where the team use a community platform to author and host our online documentation. It’s not a perfect solution for my team or our users, but that is about to change. I’ll have more news on this early in 2019.

When I was an ACP, my job didn’t have a great need for Adobe FrameMaker, although we had a licence as part of the Adobe Technical Communication Suite. So when an opportunity came along to learn, I grabbed it with both hands. It involved a project compiling a large process and procedures document.

The initial brief was to author in Microsoft Word until I got involved, but only because they wanted PDF output and the powers that be didn’t know any better. I suggested that Adobe FrameMaker was a better fit, and once I’d explained the benefits this was accepted. I’d dabbled with Adobe FrameMaker in the past, but this gave me the opportunity to learn and use it properly.

Adobe FrameMaker was once described to me as the “Rolls Royce of authoring tools”. I’ve also heard it called lots of other things, some of which are not repeatable here! I won’t say it’s perfect (what application is?), but a lot of the problems faced by those disgruntled users were caused by poor training or unrealistic expectations.

Part of the reason for this is their previous experience of applications like Microsoft Word. Microsoft were clever to design a product that was easy to use right out of the box. Perhaps too easy. It established a user base among folk who’d never even thought that something would replace the typewriter, let alone used a computer. All of a sudden everyone was creating documents… badly!

That was OK so long as all they were doing was writing a letter or making notes. Even to this day, PhD students will scream and shout about how poor Microsoft Word is when writing their 300-page thesis. Whilst there are those who swear that Word templates can cope with files that size, it’s not straightforward for your average user.

Adobe FrameMaker does have a steep learning curve. I recommend that new users attend a course or (like me) buy a good book. It’s well worth it to avoid having to reinvent your templates, and provides many a time-saving tip. The Adobe FrameMaker forums are also well worth visiting, with excellent support from real users.

Whilst I haven’t used Adobe FrameMaker in anger for a while, I’ve kept a watching eye on its iterations. There’s another release imminent, Adobe FrameMaker 2019, and there’s a webinar planned to showcase what’s included in it. See the link below to register:

https://framemaker-2019-release.meetus.adobeevents.com/

Full details of what will be shown in the webinar are included in the Adobe TechComm blog post at:

https://blogs.adobe.com/techcomm/2018/08/framemaker-2019-release.html

In summary, the release includes:

  • A major platform update, including 64-bit architecture.
  • A new PDF generation engine that negates the need for PostScript or Adobe Distiller processing.
  • UI changes including:
    • A new better organised welcome screen.
    • The return of colour icons, with a choice to revert to monochrome.
    • Changes to make finding a colour or style easier.
  • Additional language support for German (Duden).
  • Improved image handing, including transparency.
  • Improved DITA and XML workflows.
  • Support for Microsoft SharePoint 2016 or SharePoint Online.
  • Support for Adobe Experience Manager 6.4.

Todoist task formatting tips

I’m a seasoned user of Todoist, an online to-do list application that works across all devices and browsers. I’d hesitate to call myself a power user, but I do use it extensively in both my professional and personal life.

One of the reasons I love working in the IT sphere is how applications you’ve used for a while occasionally surprise you with what they can do. Todoist did that to me today when I watched one of Carl Pullein’s excellent productivity YouTube videos. I’ve embedded it below for completeness.

In it Carl formats tasks so that they:

  • Don’t need a date / time scheduled.
  • Are formatted in bold. (Note: I’ve also discovered how to format in italics or both bold & italics).

It’s All About Those Asterisks

Task formatting is as easy as adding one or more asterisks. Check it out in the short video below.
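For reference, this is the asterisk syntax as it worked for me when I tried it (treat it as a sketch: Todoist’s supported formatting may change between releases, and note that the space after the first asterisk matters):

```text
* Task name         asterisk plus a space: no checkbox, so no date needed
**Task name**       double asterisks: task name shown in bold
*Task name*         single asterisks, no space: task name shown in italics
***Task name***     triple asterisks: bold and italics combined
```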

The costs of poor communication, and how to tackle them

There’s an old joke about doctors’ handwriting being illegible. These days that’s less of an issue, as patient notes and prescriptions are typed, but typing has highlighted a different issue.

The BBC reported today on an initiative to get doctors to communicate with their patients in plain English (https://www.bbc.co.uk/news/health-45394620). Patients referred to hospitals are receiving appointment letters full of medical jargon they don’t understand, and are then making appointments with their GPs just to ask what the treatment they’re expecting actually means.

Take the following excerpt from a hospital discharge report I recently saw for someone I know:

“CTPA showed bilateral segmental and subsegmental PEs. Initial Troponin raised (46) repeat 11.”

This was supposed to inform the patient and their GP what had happened to the patient whilst in hospital, and the delivered prognosis.

The problem here is the two audiences addressed by the same deliverable. The patient’s GP will understand it, but the patient likely won’t.

In instances like this, it is often easier to resort to jargon. It’s the doctor’s own language, after all, in just the same way as DHCP or MAC addresses are to two Network Engineers. That’s fine so long as the audience is like you. Try involving an outsider, though, and you’re asking for trouble.

You need two separate deliverables, based on the same content. That’s something most Technical Communicators understand and deal with on a daily basis, particularly in a software environment. Whether it is end users or administrators, English or Spanish speakers, you need to have the content for each audience generated from the same source.
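As a sketch of how that might look in practice (this is illustrative DITA-style conditional markup, not anything the hospital actually uses), you tag each version of a statement with its audience, and a publish-time filter includes only the matching paragraphs:

```xml
<!-- Single source: a filter file (e.g. a DITAVAL) selects which
     audience's paragraphs appear in each generated letter. -->
<p audience="clinician">CTPA showed bilateral segmental and
  subsegmental PEs. Initial troponin raised (46); repeat 11.</p>
<p audience="patient">A chest scan showed small blood clots in both
  lungs. A blood test was higher than normal at first (46), but had
  fallen to 11 when repeated.</p>
```

Generating the GP letter and the patient letter from the same source keeps the two versions in step, which is exactly the single-sourcing discipline Technical Communicators practise every day.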

Mark Baker asked on Twitter recently why Technical Communicators find it so hard to explain our profession’s importance. It elicited a fair few responses, yet none really answered the question.

It’s an interesting question. We’re good at explaining things within our own specific spheres. We can even turn our hands to different spheres, but ask us to explain why we’re so important to others and we seem to struggle.

Case studies like the UK doctors help us, in that a direct effect of poor communication is wasted GP appointments and frustrated patients and doctors. By calculating the time and money spent on these appointments, we can put a monetary value on the problem. Armed with that information, we can argue that putting us to work on the issue will save the organisation money.

Maybe there’s a lesson for us there.