Content Management White Paper: Making Content Available

If your content system is meant to make content available to human users, you will likely want to provide some means for routing that content to them.

Content becomes available to your known users through permissions that grant them access. Permitted users can then share content with others outside your system through shares, assuming you have enabled such functionality.

By changing the permissions that affect a given user’s account, you can change what content that user sees and accesses. (Optionally, of course, you can disable the account if you want the user to have no further access.)

Shares created by users can include access controls; expiration dates and options for enabling downloads and re-shares are common. These shares can also be embedded into websites and other locations where you want the content displayed.

Another good way to make content accessible to those outside your content system is through the use of portals or microsites. In practice, there is little difference between the two, as far as users are concerned. In both cases, the user connects to a website location where content that resides in the content system is accessible, often without having to log in to an account.

Depending on system capabilities and configuration, users can search and browse for content, preview it, see metadata, and download or share it.

In some cases, the portal is a standalone site, meaning it has its own URL and, when loaded, the portal is all the user sees. In other cases, the portal can be embedded into an existing website page. This option is more popular when user interactivity does not require a fully featured interface, or when only a few pieces of content will be made available.

For example, your content system might be used to manage a million different pieces of content, but when users connect to your product documentation page, you want them to see only product documentation. By limiting what users see, you improve their experience by sparing them the effort of searching through content they do not need.

Routing content between business systems

An API (application programming interface) enables software developers to create add-ons to extend system functionality, build connectors to move data between systems, design alternate user interfaces, and more.

A content system designed “API first” was built with the expectation that it would be connected to other systems. By contrast, some systems have APIs that were later added to the core product or, in some cases, there is no API at all.

The Meaning of API First

“API first” does not mean a system has no user interface. It means only that API accessibility is considered a primary means for working with the system, so the API must enable remote systems to perform reasonable workflows, actions or processes within the system.

All APIs are limited in what they can do. In theory, an API enables a machine user to do in the system whatever a human user can do. In some cases, an API provides even more functionality to machine users than to humans. For example, while a human user might see a menu option to create one new file, an API might enable a remote system to create a thousand new files in a single API call.
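As an illustration of that kind of batch capability, the sketch below posts many new items in a single request. The base URL, endpoint, payload shape and token-based authentication are assumptions made for the example; they do not describe the API of any particular product.

    # Hypothetical sketch of a batch-create call against a content system API.
    # The base URL, endpoint and payload shape are assumptions for illustration only.
    import requests

    API_BASE = "https://content.example.com/api/v1"  # assumed base URL
    API_TOKEN = "YOUR_API_TOKEN"                     # assumed token-based authentication

    def create_files_in_bulk(file_names):
        """Create many content items in one request; a UI usually exposes one at a time."""
        payload = {"items": [{"name": name} for name in file_names]}
        response = requests.post(
            f"{API_BASE}/contents/batch",
            json=payload,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

    # A thousand files in one call, rather than a thousand clicks.
    result = create_files_in_bulk([f"report-{i}.pdf" for i in range(1000)])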

A well-designed API considers the different use cases of human and machine users, and limits or extends the available functionality accordingly. This is why an API-first system can be expected to do more in a multi-system environment.

API Quality and Completeness

When evaluating content system components, do so with the assistance of a developer or other technical expert who can accurately assess the value of the component’s API. Some APIs offer only limited functionality or performance that might make them unsuitable for your plans and goals. In addition, it is important that sample code be made available in the programming languages your team uses. These software development kits (SDKs) can be the difference between your developers becoming productive immediately and spending weeks researching how things should be done.

Content systems can be very valuable when used as hubs between a number of content-consuming business systems. In this capacity, they offer a number of important advantages:

  • Content is centralized, accessible to all systems that need it
  • Developers can rely on a single central system when considering content routing needs

That last point is important because a centralized content hub provides developers with a constant when connecting systems. In other words, instead of trying to figure out how to best connect two systems to one another directly, all they need to know is how to connect each of those systems to the content hub, a system that is already known. This removes a variable in each instance, which can lead to more reliable and effective results.

In addition, when each business system is connected to the central hub, this greatly reduces the number of system connections that are required. Consider having just three different content systems, one of which is the hub. In order for all three systems to share content, only two connections are required. But if the connections are system-to-system, three are required.

The math becomes more complex as more systems are added. For example, interconnecting five content systems without a hub would require some 10 different connections. Using a hub, that number is reduced to four. Not only does this save significant financial, development and time resources, it also becomes far easier to maintain over time. A network of too many system-to-system connections can be more prone to technical issues, and far more complex to debug when something does go wrong.
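The arithmetic is simple: a full point-to-point mesh of n systems needs one connection per pair of systems, or n × (n − 1) / 2, while a hub-and-spoke arrangement needs only n − 1. The short sketch below reproduces the figures mentioned above.

    # Connections needed to interconnect n systems:
    #   point-to-point mesh: one connection per pair of systems -> n * (n - 1) / 2
    #   hub-and-spoke:       each system connects only to the hub -> n - 1
    def mesh_connections(n: int) -> int:
        return n * (n - 1) // 2

    def hub_connections(n: int) -> int:
        return n - 1

    for n in (3, 5, 10):
        print(f"{n} systems: mesh={mesh_connections(n)}, hub={hub_connections(n)}")
    # 3 systems: mesh=3, hub=2
    # 5 systems: mesh=10, hub=4
    # 10 systems: mesh=45, hub=9

With ten systems the difference is 45 connections versus nine, which is where the maintenance savings become most apparent.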

Then, with all relevant systems connected, you gain the benefit of being able to route content between them. Reuse of content is common today, but virtually all content-consuming systems treat content as a protected asset, unavailable outside that system. This is common with website content management systems (WCMS), marketing automation systems (MA), product information management systems (PIM) and more.

By contrast, when systems are connected via API, they can share information. This provides two important benefits:

  • Users are not required to add or update content in more than one system
  • A single system can serve as the master for content, providing a “single source of truth” that helps ensure accuracy and consistency

The concept of a master source of information is encouraged in information management theory, but this is not always the practice in multi-system software environments. The trouble is that, with each business system considering itself the master of all content it contains, users are forced to feed information to multiple “masters,” or arbitrarily choose one system to be the master of one type of data, while another serves as master for another type.

The most common solution in a multisystem environment is to grant the role of “master” to the system most closely associated with the information.

For example, product information is managed in the product information management system, which is recognized as the master source for that information. Marketing campaigns are managed in the marketing automation system, which serves as the master for that information. And the customer relationship management system (CRM) serves as the master for customer and prospect information.

Anyone who has connected a marketing automation system (MA) to a CRM knows the complexities involved. It makes some sense that contact information be managed in the CRM. After all, this is the point of that system. But the MA also uses that information. This means the data must be sent from the CRM to the MA, and updated as needed. But what happens when a customer updates her contact information based on an email that was sent from the MA? Logic dictates that the MA send the update to the CRM.

But what happens if both customer records have been edited since they were last synchronized? For example, the MA might have received a new mailing address, but a new contact phone was entered into the CRM. Both records are now updated, without the other system being informed. When the synchronization finally occurs, the system is left to decide which data is most recent.

More sophisticated integrations can update records on a per-field basis, which solves this problem in most cases. But this “simple” scenario involves only two systems. When you add other systems to the mix, such as a system that handles purchase information for that customer, or another that manages support requests, things become very complicated.
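As a rough illustration, a per-field merge can be as simple as storing a “last updated” timestamp with each field and letting the newer value win. The sketch below uses made-up field names and record shapes; a real integration would also need to handle identifiers, deletions and true conflicts.

    # Minimal sketch of per-field synchronization: every field carries its own
    # "last updated" timestamp, and the newer value wins during a merge.
    # Field names and record shapes are illustrative only.
    from datetime import datetime

    def merge_records(record_a, record_b):
        """Merge two copies of the same contact, field by field."""
        merged = {}
        for field in set(record_a) | set(record_b):
            a = record_a.get(field)
            b = record_b.get(field)
            if a is None or b is None:
                merged[field] = a or b
            else:
                # Keep whichever value was updated most recently.
                merged[field] = a if a["updated"] >= b["updated"] else b
        return merged

    crm = {
        "phone":   {"value": "+41 71 555 01 23", "updated": datetime(2023, 4, 2)},
        "address": {"value": "Old Street 1",     "updated": datetime(2023, 1, 10)},
    }
    ma = {
        "phone":   {"value": "+41 71 555 00 00", "updated": datetime(2023, 1, 10)},
        "address": {"value": "New Street 5",     "updated": datetime(2023, 4, 5)},
    }
    print(merge_records(crm, ma))
    # The phone number comes from the CRM, the address from the MA; nothing is lost.

Whether “newest wins” is the right policy is itself a business decision; the point is that the decision is made per field rather than per record.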

Before long, it becomes unclear to both machine and human users which system is the master of any given data, or even which data is most current or accurate. If you have merged records in a CRM, you might have experienced that hesitation while you considered which record held the most recent or complete updates of each data field.

The concept of master data management (MDM) refers to having a plan in place that describes, without ambiguity, the sources from which information comes, which system owns the information, and how information is shared with “downstream” systems that need it.

In an ideal world, a single system would serve as the master of all business content, be that customer or product information, marketing messaging and campaign results, technical documentation, contracts and orders, and everything else. But the diverse use cases for these content types suggest that no such system is likely to appear any time soon.

A next-best solution would be to abstract the content that is common to all systems so that it can be managed in a single system that serves as the primary content router between systems. From there, the content can be filtered down to more specific systems, where additional information can be added and, perhaps, flow further downstream to additional systems or output channels.

In a way, you can think of this as a content pyramid, with the most common content at the top, and the more granular content below. Consider these examples of content that is of use across all systems and should be consistent everywhere:

  • Copyrights, disclaimers or other legal notices
  • Contact data, such as emails, addresses and phone numbers
  • Official logos, executive headshots, or other such graphical content

Content like this needs to be managed by persons authorized to make changes, which won’t likely include the majority of your workforce. Further, that official and sanctioned content needs to be readily accessible when it is needed.

Without the use of a master system, copies of this content would be added to each system. When the source content is updated, each location where that content has been copied would have to likewise be updated. The problem is that this rarely happens. The result is that materials get released that include old logos, outdated office addresses, or worse.

One of the original promises of Digital Asset Management was that content like this could be stored in a central location from where people would get it when they needed it. The problem with this approach is that it does not alleviate the primary problem, which was that copies of the content needed to be added to other systems.

When systems are configured to pull content, on demand, from a master source, the chances of outdated content getting released are greatly reduced. For example, if your copyright notice is embedded as a link into templates or other locations where it is used, the users creating content with those templates would not have to worry about whether the notice was current. Knowing it was coming from the official source, the user could rest assured that what would be published was accurate.

This concept is especially important when the source content is more complex, such as an end user license agreement (EULA) or terms of use. When a software developer is building an install wizard, she will not read the EULA to determine whether it is current. In fact, she will likely have no idea what the software’s terms of use are; that is the job of the Legal department. All she wants is a link from which she can pull the most recent version of the EULA and include it in her build.
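In practice, this can be as little as a build step that pulls the document from a stable link published by the content hub. The sketch below assumes a hypothetical URL and target folder purely for illustration.

    # Hedged sketch: during a software build, fetch the current EULA from a stable
    # link published by the content hub, rather than bundling a local copy.
    # The URL and target folder are assumptions for illustration.
    import pathlib
    import requests

    EULA_URL = "https://content.example.com/share/eula-latest.txt"  # assumed stable link

    def include_latest_eula(build_dir: str) -> None:
        """Pull the most recent EULA and place it where the installer expects it."""
        response = requests.get(EULA_URL, timeout=30)
        response.raise_for_status()
        target = pathlib.Path(build_dir)
        target.mkdir(parents=True, exist_ok=True)
        (target / "EULA.txt").write_text(response.text, encoding="utf-8")

    include_latest_eula("./installer/resources")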

This is a great example of more granular content into which more general content is embedded. It makes no sense to store software source code in a system not designed to manage source code, considering the unique requirements of software development. But it also makes no sense to manage copyright notices, disclaimers, EULAs and other content that will be included in software builds in systems where they cannot be reliably updated.

When considering all the business systems in use at your organization, consider what unique content is created with each, and identify the content that should be shared across all systems. This will help you illustrate a pyramid that defines the most reasonable master for each content type. And this, in turn, will enable you to plan the best ways in which to connect your systems.
