Digital Experience and the Internet of Things

Recently I created a Digital Experience + Internet of Things concept.  It started when I found a really neat Internet of Things Foundation demo.  Give it a try, and you’ll see your smartphone moving in real-time using only your browser.

IoT Demo App

This demo is part of IBM’s Internet of Things Foundation – a service available on Bluemix, which is a platform that provides services like this along with “runtimes” to build applications.  A runtime might be a traditional Java server like WebSphere Liberty or a more modern runtime like Node.js.  Personally I enjoy writing applications in Node.js, but I’d rather build this application in IBM Digital Experience.  A few reasons why:

  • Security.  It seems like every Node.js sample application I see has some comment like /* your authentication code here */.  That stuff is hard … there’s a reason someone didn’t actually write it.
  • Context.  Say we build an IoT application in which we can see data like speed, temperature, and movement in real-time.  Alone that’s interesting, but if there’s a problem, a maintenance manual would be helpful.  Or maybe a list of online technicians with relevant skills to assist.  Combining context with content is powerful, but in the “app-only” model more scope means more code.
  • Integration.  An IoT platform is there to obtain and store information from the “things”.  To get that information, you need to use APIs – that’s an integration need.  And then there’s the parsing of data and web presentation and all the stuff – again – that needs to be as simple as possible.

Here’s the concept that I built.  It comes in two forms: an overview page with a list of devices deployed and a detail page that shows specifics about the device.

Let’s dissect the overview page.

IoT Overview Page

The chart and the table are web content.  And both make use of the Digital Data Connector (DDC).  In short, DDC fetches the data from the IoT platform.  It will also parse the incoming data – JSON in this case.  What happens next is the really cool part.  The web developer doesn’t touch the data.  They use placeholders in their HTML to refer to the data – no parsing, no “where did this come from”.  They just know that if they use the “id” placeholder the real value from the data will be shown.  Visually it looks like this: on the left is the web developer writing HTML in Digital Experience and on the right is the raw data a developer using an API would see.  The arrows show how DDC bridges the two.
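Conceptually, the bridging DDC performs behaves like the substitution below.  This is a hypothetical sketch for illustration only – it is not DDC’s actual implementation, and the placeholder syntax and data shape are made up to mirror the screenshot:

```javascript
// Hypothetical illustration of what DDC does for the web developer: markup
// with placeholders on one side, parsed JSON from the IoT platform on the other.
function renderTemplate(template, item) {
    // Replace each [placeholder] token with the matching field from the data
    return template.replace(/\[(\w+)\]/g, function (match, key) {
        return key in item ? item[key] : match;
    });
}

var template = '<tr><td>[id]</td><td>[type]</td></tr>';
var device = { id: "vans-iphone", type: "iot-phone" }; // one entry from the feed

console.log(renderTemplate(template, device));
// -> <tr><td>vans-iphone</td><td>iot-phone</td></tr>
```

The web developer only ever sees the template line; the data lookup happens server-side.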

IoT DDC

What are some other fun facts?

  • The chart on the left uses DDC and web content too.  But in addition to emitting the HTML markup seen in the screenshot, it emits Javascript code.  And that code works with a Javascript library called Chart.js.
  • The data is being delivered directly to Digital Experience and not to the browser.  This feature is known as the Outbound HTTP Proxy (formerly AJAX Proxy).  It’s an important point because A) to the user, it’s all coming from Digital Experience and B) not all services will allow browser to service (CORS) communication.
  • The Proxy I mentioned also supports authentication to external services.  I was able to exploit the Bluemix demo easily because the user credentials were visible in the web app.  Conversely, Digital Experience allows me to pass the user’s credentials or a shared credential (Credential Vault) to the IoT platform from the server rather than the web app.  Just one more thing that made this easier.
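To make the chart bullet concrete, the web content emits a configuration along these lines.  This is a simplified sketch: the labels, dataset values, and canvas id are illustrative stand-ins for what the real presentation template fills in through DDC placeholders:

```javascript
// Sketch of the chart configuration a presentation template might emit.
// In the real template, the labels and data values come from DDC placeholders.
var chartConfig = {
    type: "bar",
    data: {
        labels: ["vans-iphone", "demo-device-2"],   // device ids from the feed
        datasets: [{
            label: "Events received",
            data: [42, 17]                          // values from the feed
        }]
    }
};

// In the browser, the emitted script then instantiates the chart:
// new Chart(document.getElementById("deviceChart"), chartConfig);
```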

Next, let’s look at the detail page (click it to animate).

IoT Live View

The web content you see on the left is contextual.  This is the example I gave earlier – based on what I’m seeing, what else might be helpful to display?

The graph you see with my phone moving up and down is data that is being sent from the IoT platform.  I re-used the Chart.js library from the other page to graph the data points in real-time.  And these data points are being sent via an MQTT Javascript client that is communicating with the IoT platform.

To build the MQTT client, I used IBM’s Script Portlet.  The Script Portlet allows me to write a simple web application using nothing more than a browser.  (It’s like 80 lines of code!)

IoT Script Portlet

But rather than use the web editor you see here, I developed the application locally.  This allowed me to use my favorite IDE.  When the app was ready, I simply pushed a button and published it in Digital Experience thanks to the local developer tools.

Now for the technical details.

To use DDC, I suggest simply reading Stuart’s post on developerWorks.  Here are the properties needed for the WP List Rendering Profile Service.

IoT DDC Properties

Don’t forget to import the SSL certificate for Bluemix and to set the AJAX proxy digital_data_connector_policy URL.  Both are documented in the article.  To test the AJAX Proxy (Outbound HTTP Proxy), access the following URL.  You should get data back.

http://<your portal>/wps/proxy/https/

The hpaa.slotid=iot is what adds the shared credentials from the Credential Vault to the request.
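Putting that together, a browser-side request routed through the proxy can be built like this.  A hypothetical sketch: the portal host and the device-API path are illustrative, and only the wps/proxy URL pattern and the hpaa.slotid parameter come from the setup above:

```javascript
// Build a proxied request URL: Portal fetches the remote service server-side,
// attaches the shared credential from the Credential Vault slot, and relays
// the response back to the browser.
function proxyUrl(portalBase, targetPath, slotId) {
    // targetPath is the remote URL with "https://" collapsed to "https/"
    return portalBase + "/wps/proxy/https/" + targetPath +
           (targetPath.indexOf("?") === -1 ? "?" : "&") +
           "hpaa.slotid=" + slotId;
}

var url = proxyUrl("http://myportal.example.com",                              // hypothetical portal host
                   "play.internetofthings.ibmcloud.com/api/v0002/bulk/devices", // hypothetical API path
                   "iot");
console.log(url);
// -> http://myportal.example.com/wps/proxy/https/play.internetofthings.ibmcloud.com/api/v0002/bulk/devices?hpaa.slotid=iot
```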

The MQTT Client application is the following.

  <div style="display:none" data-script-portlet-original-tag="head">
    <script type="text/javascript" src="[Plugin:ScriptPortletElementURL element="js/require.js"]"></script>
    <script type="text/javascript">
    require.config({
        baseUrl : "/"
    });
    require(["ibmiotf"] , function(Client){
        console.log("loaded IOTF library");
        var data = {
            labels: [],
            datasets: [{
                label: "Acceleration (Y)",
                fill: false,
                lineTension: 0.1,
                backgroundColor: "rgba(75,192,192,0.4)",
                borderColor: "rgba(75,192,192,1)",
                borderCapStyle: 'butt',
                borderDash: [],
                borderDashOffset: 0.0,
                borderJoinStyle: 'miter',
                pointBorderColor: "rgba(75,192,192,1)",
                pointBackgroundColor: "#fff",
                pointBorderWidth: 1,
                pointHoverRadius: 5,
                pointHoverBackgroundColor: "rgba(75,192,192,1)",
                pointHoverBorderColor: "rgba(220,220,220,1)",
                pointHoverBorderWidth: 2,
                pointRadius: 1,
                pointHitRadius: 10,
                responsive: false,
                data: []
            }]
        };

        var ctx = document.getElementById("myChart");
        var myLineChart = new Chart(ctx, {
            type: 'line',
            data: data
        });

        var appClientConfig = {
            "org" : "play",
            "id" : "vans-iphone",
            "auth-key" : "<probably should get your own>",
            "auth-token" : "<ditto>"
        };
        var appClient = new Client.IotfApplication(appClientConfig);
        console.log("loaded IOTF client " + appClient);
        appClient.connect();
        appClient.on("connect", function () {
            // listen for events from the devices in the organization
            appClient.subscribeToDeviceEvents();
        });
        appClient.on("deviceEvent", function (deviceType, deviceId, eventType, format, payload) {
            console.log("Device Event from :: "+deviceType+" : "+deviceId+" of event "+eventType+" with payload : "+payload);
            var json = JSON.parse(payload);
            // this is a hack to ensure when the device is offline that the chart does not
            // push new data entries
            if(json.d.ay != data.datasets[0].data[data.datasets[0].data.length-1]) {
                data.labels.push("");
                data.datasets[0].data.push(json.d.ay);
                myLineChart.update();
            }
        });
    });
    </script>
  </div>
  <div data-script-portlet-original-tag="body">
    <canvas id="myChart" width="300" height="150"></canvas>
  </div>

Notice this snippet.

        baseUrl : "/"

I’m setting the base path where requirejs will look for the ibmiotf module.  This means the ibmiotf.js file must be at http://<webserver>/ibmiotf.js for example.  In my setup, I placed it on the IBM HTTP Server (htdocs folder).  I did this because I had difficulty getting requirejs to play nicely with the Script Portlet.  The ibmiotf.js module can be found in the dist folder of /iot-nodejs on GitHub (iotf-client-bundle.min.js).
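In principle, a requirejs paths mapping could point at the module explicitly instead of relying on baseUrl resolution.  A sketch only – the location below is hypothetical, and as noted above I had trouble getting requirejs and the Script Portlet to cooperate:

```javascript
// requirejs paths config mapping the "ibmiotf" module id to an explicit
// location; requirejs appends the .js extension automatically.
var requireConfig = {
    paths: {
        ibmiotf: "/iot/iotf-client-bundle.min"  // hypothetical path on your web server
    }
};

// In the Script Portlet this would be applied as:
//   require.config(requireConfig);
//   require(["ibmiotf"], function (Client) { /* ... */ });
```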

Chart.js was added as a theme module and profile.  This allowed me to simply change the profile of the overview and detail pages to include the charting functionality.  Be careful that OS files don’t sneak their way onto the server (._Chart.js seen in the screenshot).  This usually results in the theme code failing because there’s a foreign file it does not understand.

IoT Chart Theme Module iot_profile

Happy Coding!

Building Social Applications using Connections Cloud and WebSphere Portal: SAML and Single Sign-On

If you’ve been following along, we’ve created a custom solution that combines WebSphere Portal and Connections Cloud to create a socially enabled web site.  To access information in Connections Cloud, OAuth is used as the mechanism to exchange data.  Unfortunately, OAuth does not do everything.  If a user were to follow a link from Portal to Connections Cloud, he or she would need to log in to Connections Cloud.  What gives?

The reason is that WebSphere Portal is what authenticates to get the user’s data, but the actual user’s browser is not authenticated.  The result feels like a disconnect for users and non-technical observers.  The solution to this problem is SAML, or Security Assertion Markup Language.  I’ve written about SAML on this blog previously; see Using SAML with SmartCloud for Social Business.  What I’ll do here is add to that work to create a solution that uses WebSphere Portal.


The general flow goes a bit like this:

  1. “Something” triggers the SAML process.  It could be a strict precondition (like right after you log in) or dynamic (something realizes that you have not yet authenticated).
  2. Send the user to a web application hosted on Portal.  This web application (the actual page a user visits) is designed to construct the SAML token.
    1. The SAML token is signed using a certificate previously exchanged with IBM.  There is a manual Support process that you must follow for this to work.
  3. The web page then POSTs the SAML assertion to Connections Cloud.
    1. Connections Cloud decrypts the token, inspects the user’s identity and allows access if appropriate.


Why would I ever do this?!?!  There are existing solutions I could use: Tivoli Federated Identity, Microsoft Active Directory Federation Server, Shibboleth.  But I needed something narrow in scope.  I don’t want an identity server – I already have one, Portal.  And I don’t want to add more servers to the existing deployment.  So I’ve created a module that does only one thing: determines who you are from Portal, creates a SAML assertion, and sends it to Connections Cloud.

That and I had the code lying around … it just needed a use case.


This is a fairly technical project.  Even my eyes glaze over when I start hearing about ciphers and key chains, but the main moving parts are as follows.

SAML Servlet

The SAML Servlet listens for incoming requests.  In doing so, it will do the following:

  1. Figure out the user’s identity.  The web module is protected and thus all users must be authenticated by WebSphere to access it.
  2. Construct the SAML token.
  3. Generate a web page with the form that sends the token to Connections Cloud.  (Javascript will submit the form automatically for the user.)
    1. There’s also a bit of code that will handle the redirect (302) coming from Connections Cloud if the admin has configured Connections Cloud to use only this servlet as the identity server.
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class Saml11Servlet extends HttpServlet {
	private static final long serialVersionUID = 1L;
	private Saml11ServletConfig config;
	private ISamlSigner signer;
	private SamlCreator creator;
	private ISamlIdentityProvider idProvider;
	private final String emailParam = "email.domain";

	public void init() throws ServletException {
		config = new Saml11ServletConfig(this.getServletConfig());
		signer = new SimpleSamlSigner(config.getKeyStorePath(),
				config.getKeyStorePassword(), config.getKeyStoreAlias());
		creator = new SimpleSaml11Creator(config, signer);
		// there are additional ways to get the email from a logged in user;
		// be careful as not all WebSphere servers use VMM (i.e. Federated Security)
		idProvider = new PrincipaltoEmailIdentityProvider(this.getServletConfig().getInitParameter(emailParam));
	}

	protected void doGet(HttpServletRequest request,
			HttpServletResponse response) throws ServletException, IOException {
		// incoming 302 from Connections Cloud has TARGET parameter
		String target = request.getParameter("TARGET");
		String identity = idProvider.getUserIdentity(request.getUserPrincipal().getName());
		String token = creator.create(identity, null);
		response.getWriter().print(getForm(config.getEndpoint(), token, target));
	}

	private String getForm(String endpoint, String token, String target) {
		return "<html xml:lang=\"en\" xmlns=\"\"><head><meta http-equiv=\"Content-Type\" content=\"text/html; charset=UTF-8\"> <title>SAML POST response</title> </head> <body> <form method=\"post\" action=\"" + endpoint + "\"><p><input name=\"TARGET\" value=\"" + target + "\" type=\"hidden\"> <input name=\"SAMLResponse\" value=\"" + token + "\" type=\"hidden\"> <noscript> <button type=\"submit\">Sign On</button> <!-- included for requestors that do not support javascript --> </noscript> </p> </form> <script type=\"text/javascript\"> setTimeout('document.forms[0].submit()', 0); </script> Please wait, signing on... </body></html>";
	}
}

Identity Provider

This is pretty basic.  I’m assuming that the user currently has a Portal session.  If so, I’m grabbing the Principal (e.g. wpadmin) and then appending a configurable email domain.  Connections Cloud requires the identity to be in email format.  Thus you can implement your identity lookup any way you want, but the final value must be an email address that is also stored in Connections Cloud.

public class PrincipaltoEmailIdentityProvider implements ISamlIdentityProvider {
	private final String domain;

	public PrincipaltoEmailIdentityProvider(String domain){
		this.domain = domain;
	}

	public String getUserIdentity(String userId) {
		return userId + "@" + domain;
	}
}

SAML Token Creator

Things are about to get interesting.  With the user’s identity, we need to construct the SAML token.  I’ve chosen a SAML 1.1 implementation … because it was easier.  This code simply takes the template XML and substitutes appropriate values.  The result at the end is really just XML.  But this is where mistakes happen.  The values used must be accurate in both value and format (e.g. dates).

import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.Iterator;
import java.util.Map;
import java.util.Map.Entry;
import java.util.SimpleTimeZone;
import org.apache.commons.lang.StringEscapeUtils;

public class SimpleSaml11Creator extends SamlCreator {
	// the template is abbreviated here; the full XML ships with the project code
	private static final String SAML_11_TEMPLATE = "<samlp:StatusCode Value=\"samlp:Success\" />$AUDIENCE$"
			+ "<saml:AuthenticationStatement AuthenticationInstant=\"$AUTH_INSTANT$\" AuthenticationMethod=\"urn:oasis:names:tc:SAML:1.0:am:password\">"
			+ ""
			+ "$USER_NAME$"
			+ "urn:oasis:names:tc:SAML:1.0:cm:bearer"
			+ "<saml:NameIdentifier Format=\"urn:oasis:names:tc:SAML:1.0:assertion#emailAddress\">$USER_NAME$"
			+ "urn:oasis:names:tc:SAML:1.0:cm:bearer"
			+ "$ATTRIBUTES$";
	public static final String ATTRIBUTE_FORMAT = "$ATTR_VALUES$";
	public static final String ATTRIBUTE_VALUE_FORMAT = "$ATTR_VALUE$";

	public SimpleSaml11Creator(ISamlConfig config, ISamlSigner signer) {
		super(config, signer);
	}

	protected String getToken(String userId, Map<String, String[]> userAttrs) {
		Date now = new Date();
		SimpleDateFormat format = new SimpleDateFormat(DATE_FORMAT);
		Calendar cal = Calendar.getInstance(new SimpleTimeZone(0, "GMT"));
		format.setCalendar(cal); // SAML timestamps must be in GMT
		long validLength = (long) 60000 * config.getTokenExpiration();
		// Setup the time for which SAML assertion is valid
		Date notBefore = new Date(now.getTime() - validLength);
		Date notAfter = new Date(now.getTime() + validLength);
		String saml = SAML_11_TEMPLATE.replace("$ISSUE_INSTANT$", format.format(now));
		saml = saml.replace("$RESPONSE_ID$", now.getTime() + "");
		saml = saml.replace("$AUDIENCE$", config.getEndpoint());
		saml = saml.replace("$RECIPIENT$", config.getEndpoint());
		saml = saml.replace("$ISSUER$", config.getIssuer());
		saml = saml.replace("$ASSERTION_ID$", now.getTime() + "");
		saml = saml.replace("$NOT_BEFORE$", format.format(notBefore));
		saml = saml.replace("$NOT_AFTER$", format.format(notAfter));
		saml = saml.replace("$AUTH_INSTANT$", format.format(now));
		saml = saml.replace("$USER_NAME$", StringEscapeUtils.escapeXml(userId));
		StringBuilder allAttrs = new StringBuilder();
		if (userAttrs != null) {
			Iterator<Entry<String, String[]>> iterator = userAttrs.entrySet().iterator();
			while (iterator.hasNext()) {
				Entry<String, String[]> entry = iterator.next();
				boolean attHasValue = false;
				// Setup the attribute values
				String attrValues = "";
				String[] values = entry.getValue();
				// Make sure values is not null
				if (values != null) {
					for (int j = 0; j < values.length; j++) {
						String value = values[j];
						// Only add the attribute if there is a value to add
						if (value != null && value.length() > 0) {
							attHasValue = true;
							attrValues += ATTRIBUTE_VALUE_FORMAT.replace("$ATTR_VALUE$",
									StringEscapeUtils.escapeXml(value));
						}
					}
				}
				if (attHasValue) {
					// Setup the attribute name and namespace
					String key = entry.getKey();
					String attribute = ATTRIBUTE_FORMAT.replace("$ATTR_NAME$", key);
					attribute = attribute.replace("$ATTR_NAMESPACE$", ""); // namespace left empty in this abbreviated template
					attribute = attribute.replace("$ATTR_VALUES$", attrValues);
					allAttrs.append(attribute);
				}
			}
		}
		saml = saml.replace("$ATTRIBUTES$", allAttrs.toString());
		return saml;
	}
}

SAML Signer

Now that there’s an XML SAML token, we need to sign it.  The signer class doesn’t do the actual signing; I’ve taken care of that implementation in the SAML Creator class.  The signer class’s job is to produce the X509Certificate.  I’ve chosen to simply pull it off disk from a configurable location.  A better implementation would get it from the built-in WebSphere keystores.  And since it’s not that interesting, I’ve left it out of the blog post (though it’s in the project code).

SAML Creator

And now we need to bring it all together. Take the XML, sign it with the certificate and hand it back to the servlet for posting to Connections Cloud.

import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.logging.Logger;
import javax.xml.crypto.dsig.CanonicalizationMethod;
import javax.xml.crypto.dsig.DigestMethod;
import javax.xml.crypto.dsig.Reference;
import javax.xml.crypto.dsig.SignatureMethod;
import javax.xml.crypto.dsig.SignedInfo;
import javax.xml.crypto.dsig.Transform;
import javax.xml.crypto.dsig.XMLSignature;
import javax.xml.crypto.dsig.XMLSignatureFactory;
import javax.xml.crypto.dsig.dom.DOMSignContext;
import javax.xml.crypto.dsig.keyinfo.KeyInfo;
import javax.xml.crypto.dsig.keyinfo.KeyInfoFactory;
import javax.xml.crypto.dsig.keyinfo.X509Data;
import javax.xml.crypto.dsig.spec.C14NMethodParameterSpec;
import javax.xml.crypto.dsig.spec.TransformParameterSpec;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.bouncycastle.util.encoders.Base64; // any Base64 utility with a byte[] encode will do
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public abstract class SamlCreator {
	private final Logger logger = Logger.getLogger(SamlCreator.class.getName());
	public static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
	protected ISamlConfig config;
	protected ISamlSigner signer;

	protected abstract String getToken(String userId,
			Map<String, String[]> userAttrs);

	public SamlCreator(ISamlConfig config, ISamlSigner signer) {
		this.config = config;
		this.signer = signer;
	}

	public String create(String userId, Map<String, String[]> userAttrs) {
		logger.fine("Creating SAML token for " + userId);
		String saml = getToken(userId, userAttrs);
		logger.finest("SAML token = " + saml);
		Document doc;
		try {
			DocumentBuilderFactory dbfac = DocumentBuilderFactory.newInstance();
			DocumentBuilder docBuilder = dbfac.newDocumentBuilder();
			ByteArrayInputStream is = new ByteArrayInputStream(saml.getBytes("UTF-8"));
			doc = docBuilder.parse(is);
			logger.finest("Successfully converted SAML token to " + doc.getClass().getName());
			NodeList nodes = doc.getDocumentElement().getChildNodes();
			for (int i = 0; i < nodes.getLength(); i++) {
				// Sign the SAML assertion
				if (nodes.item(i).getNodeName().equals("saml:Assertion")) {
					if (config.signAssertion()) {
						logger.fine("Signing SAML assertion element");
						signSAML((Element) nodes.item(i), null, "AssertionID");
					} else {
						logger.fine("Skipping signing of SAML assertion element");
					}
				}
			}
			logger.fine("Signing SAML response");
			// Sign the entire SAML response
			Element responseElement = doc.getDocumentElement();
			signSAML(responseElement,
					(Element) responseElement.getFirstChild(), "ResponseID");
			// Transform the newly signed document into a string for encoding
			TransformerFactory fac = TransformerFactory.newInstance();
			Transformer transformer = fac.newTransformer();
			StringWriter writer = new StringWriter();
			transformer.transform(new DOMSource(doc), new StreamResult(writer));
			String samlResponse = writer.toString();
			logger.finest("SAML token = " + samlResponse);
			logger.fine("Base64 encoding SAML token");
			// Encode the string and return the response
			return new String(Base64.encode(samlResponse.getBytes("UTF-8")));
		} catch (Exception e) {
			logger.severe("Failed to create SAML token: " + e.getMessage());
		}
		return null;
	}

	private void signSAML(Element element, Element sibling, String referenceID) {
		logger.fine("Signing element " + referenceID);
		try {
			// this needs to be here due to Java bug in 1.7_25
			element.setIdAttribute(referenceID, true);
			XMLSignatureFactory fac = XMLSignatureFactory.getInstance("DOM");
			DOMSignContext dsc;
			if (sibling == null) {
				dsc = new DOMSignContext(signer.getPrivateKey(), element);
			} else {
				dsc = new DOMSignContext(signer.getPrivateKey(), element, sibling);
			}
			DigestMethod digestMeth = fac.newDigestMethod(DigestMethod.SHA1, null);
			Transform transform = fac.newTransform(Transform.ENVELOPED,
					(TransformParameterSpec) null);
			List list = Collections.singletonList(transform);
			String refURI = "#" + element.getAttribute(referenceID);
			Reference ref = fac.newReference(refURI, digestMeth, list, null, null);
			CanonicalizationMethod canMeth = fac.newCanonicalizationMethod(
					CanonicalizationMethod.EXCLUSIVE, (C14NMethodParameterSpec) null);
			List refList = Collections.singletonList(ref);
			SignatureMethod sigMeth = fac.newSignatureMethod(
					SignatureMethod.RSA_SHA1, null);
			SignedInfo si = fac.newSignedInfo(canMeth, sigMeth, refList);
			KeyInfoFactory kif = fac.getKeyInfoFactory();
			List x509Content = new ArrayList();
			x509Content.add(signer.getCertificate());
			X509Data xd = kif.newX509Data(x509Content);
			KeyInfo ki = kif.newKeyInfo(Collections.singletonList(xd));
			XMLSignature signature = fac.newXMLSignature(si, ki);
			signature.sign(dsc);
			logger.fine("Successfully signed element " + referenceID);
		} catch (Exception e) {
			// TODO: throw instead of catch
		}
	}
}


If all goes well, the result should look something like this.

Download Video: MP4


And you’ll probably need the project code.

This code is as-is for education purposes.  FWIW I’m not a SAML expert; so if you post a question, there’s a good chance I won’t know.

Happy coding.




Building Social Applications using Connections Cloud and WebSphere Portal: Social Portal Pages

We’re going to use Portal’s theme framework to add the necessary CSS and JS files to our social pages.  Using this approach, we’ll no longer need to include the dependencies in our script portlets.  Pages that have social script portlets on them can simply have the relevant theme profile applied.  Another benefit is that by using Portal’s profile feature, the various browser requests are centralized into a single download to reduce the time taken to load the page.

Creating the Theme Modules

Let’s begin by adding new theme modules.  The modules will include the following resources on the page:

  • The Social Business Toolkit SDK’s Javascript dependency, for example /sbt.sample.web/library?lib=dojo&ver=1.8.0&env=smartcloudEnvironment
  • CSS files from Connections Cloud, for example /connections/resources/web/_style?

You can read how to create the module framework in the Knowledge Center.  Since the CSS files are located on a remote server, I need to create a “system” module.  This is essentially creating a plugin with the relevant extensions.  It’s a web project (WAR) with a single plugin.xml file.  The contents of my plugin.xml are as follows.

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin id=""
        name="Social Business Toolkit Theme Modules"
        version="1.0.0">
    <extension point="com.ibm.portal.resourceaggregator.module" id="sbt_sdk">
        <module id="sbt_sdk">
            <title lang="en" value="Social Business Toolkit SDK"/>
            <description lang="en" value="Social Business Toolkit SDK"/>
            <contribution type="head">
                <subcontribution type="js">
                    <uri value="{rep=WP CommonComponentConfigService;key=sbt.sdk.url}/sbt.sample.web/library?lib=dojo&amp;ver=1.8.0&amp;env=smartcloudEnvironment"/>
                </subcontribution>
                <subcontribution type="css">
                    <uri value="{rep=WP CommonComponentConfigService;}/connections/resources/web/_style?"/>
                </subcontribution>
                <subcontribution type="css">
                    <uri value="{rep=WP CommonComponentConfigService;}/connections/resources/web/_style?"/>
                </subcontribution>
                <subcontribution type="css">
                    <uri value="{rep=WP CommonComponentConfigService;}/connections/resources/web/_lconntheme/default.css?version=oneui3&amp;rtl=false"/>
                </subcontribution>
                <subcontribution type="css">
                    <uri value="{rep=WP CommonComponentConfigService;}/connections/resources/web/_lconnappstyles/default/search.css?version=oneui3&amp;rtl=false"/>
                </subcontribution>
            </contribution>
        </module>
    </extension>
</plugin>

You could use the actual server’s path, for example <some css resource>, in the XML.  But I’m using a substitution rule

{rep=WP CommonComponentConfigService;}

that will swap the corresponding keys in the plugin.xml for the values defined by WebSphere’s Resource Environment Provider.  The only reason I did this was so I could configure the URLs from WebSphere rather than hard code them into the plugin.xml.


The other thing I’m doing is telling the SBT SDK which environment I want configured by referencing sbt.sample.web/library?lib=dojo&amp;ver=1.8.0&amp;env=smartcloudEnvironment.  This alleviates me from having to manually specify the endpoint in the SBT scripts I write later.  And notice the &amp; format: you’ll need to escape the ampersands in the plugin.xml.

Create your web module and deploy to your server.  You can use the Theme Analyzer tools in Portal’s administration interface to pick up the new modules.  Just go to the Control Center feature and invalidate the cache.

Invalidate Theme

Then review the system modules to locate the sbt_sdk one.

sbtSdk Module


To actually use the module, we need to build a theme profile.  A profile is a recipe of which modules should be loaded for a particular page’s functionality.  In addition to the sbtSdk module, we’ll need other IBM provided or custom modules loaded for pages to properly work.  Profile creation is rather straightforward.  You can use the existing profiles as a starting point.  See those in webdav for example; I use AnyClient to connect to my Portal.  Once there, you can peruse the profiles under the default theme.

I’ve created a SBT Profile that includes the SDK and Cloud modules I created earlier.

 {
  "moduleIDs": ["getting_started_module",
   "sbt_sdk"],
  "deferredModuleIDs": ["wp_toolbar_host_edit"],
  "titles": [{
   "value": "Connections Cloud",
   "lang": "en"
  }],
  "descriptions": [{
   "value": "This profile has modules necessary for viewing pages that contain portlets written with the Social Business Toolkit SDK and Connections Cloud banner integration",
   "lang": "en"
  }]
 }

This JSON file is then added to my default Portal theme using a webdav client.

SBT Profile WebDav

You’ll likely need to again invalidate the theme cache for the profile to be available for the next section.

Page Properties

To enable the profile on a page, we need to update the page properties.  The result of this process is that the aforementioned Javascript and CSS files get added to any page that has the profile enabled.

SBT Profile

And that’s it.  Now any developer can begin authoring “social” script portlets with nothing more than the page profile and a bit of web code.




Building Social Applications using Connections Cloud and WebSphere Portal: SBTSDK

To build social applications, I strongly suggest using the Social Business Toolkit SDK.  While Connections Cloud has simple to use REST APIs, the SDK provides a ready to go foundation.  Admittedly, the SDK is large and can be confusing at first.  In it you’ll find code for a variety of technology platforms: Domino, iOS, PHP, Java, Javascript.  We, developers, tend to want to jump right in and start coding.  Don’t.  If you don’t start with the SDK, you’ll find yourself building authentication mechanisms, HTTP clients, XML readers, etc … stuff you really don’t need to re-invent.

There are two ways to get the Social Business Toolkit SDK:

  • Download the SDK and sample web applications from OpenNTF.   Since the SDK and samples are pre-packaged, this allows you to simply install and get going.
  • Download the SDK source and build the projects from GitHub.  While this takes more time, the code is more recent.  The last package posted on OpenNTF is over a year old at the time of writing.

Downloading the Social Business Toolkit SDK from OpenNTF

To download and install, see the documentation.  High level steps are:

  1. Download the latest version of the SDK.
  2. Extract the zip and locate the file \samples\ear\sbt.sample- (or similar name).

Building the Social Business Toolkit SDK from GitHub

The documentation on building the source is a bit dated.  Most developers will be able to follow the steps below.

  1. Download the source (e.g. Download ZIP) from the SocialSDK repo on GitHub.
  2. Import the maven projects into Eclipse.  The Import -> Maven -> Existing Maven Projects is an option in later versions of Eclipse.  If you do not have this, consider downloading the M2Eclipse plugin.
  3. Depending on how you import, you may need to update the Project Explorer view to include the working set.

Maven SBT Working Set
  4. Select the sbt.sample project -> Export -> EAR File.

Installing the Social Business Toolkit SDK Sample

  1. Install the EAR (either the one from Downloading Step 2 or Building Step 4) using the WAS admin console.  I’m installing the SDK directly to the Portal server.
  2. Verify the application by visiting http://<portal>:<port>/sbt.sample.web/ in a web browser.  (The default port for Portal is 10039.)

SBT Home


Congratulations, now you can get started.

Building Social Applications using Connections Cloud and WebSphere Portal: Your First Social Portlet

Part of the series Building Social Applications using Connections Cloud and WebSphere Portal.

Create a Sample Application

Let’s start simple.  We’ll re-use one of the SBT SDK’s sample applications inside Portal.

  1. Go to the SBT SDK application http://<portal>:<port>/sbt.sample.web/javascript.jsp.
  2. Authenticate using OAuth if you did not do so previously.  Do this by clicking the “Login” button on the Authentication -> Authentication Summary sample.
  3. Next select the Social -> Files -> Get My Files sample.
  4. If an error occurs, update line 18 to include the smartcloudOA2 endpoint in the service.
    var fileService = new FileService({endpoint: "smartcloudOA2"});
  5. Click the “Run” button.  You should see a list of files.  (If not, just make sure there are actually files in the My Files area of Connections Cloud for this user.)
  6. Keep the sample open, we’ll use it in Portal.

SBT My Files Sample


Adding the Sample to the Script Portlet

Now we’ll take the same code seen in the Javascript tab and add that to the Script Portlet.  The effect is exactly the same as the Get My Files sample – only that it’s coming from Portal.

  1. Log in as an administrator or a user with edit rights to Portal pages.
  2. Create a new page in Portal.
  3. Add the Script Portlet to the page from the content palette.  (If you get an error when doing this, confirm the site mapping for the page in the steps below.)
    1. In the Page Properties, select Web Content -> Edit.
    2. Click “Add Web Content”.
    3. Navigate to Libraries -> Script Portlet Library -> Script Portlet Applications.
    4. “OK” and re-add the Script Portlet to the page.
  4. Click “Edit” in the Script Portlet.
  5. Fill in the HTML and Javascript tabs in the Script Portlet with the contents of the respective tabs in the SBT SDK sample.
     Script Portlet JS
  6. Open a new browser tab and navigate to http://<portal>:<port>/sbt.sample.web/library.  You will receive Javascript as a response.  Add this Javascript to the beginning of the Javascript in the Script Portlet.
  7. Save the Script Portlet.
  8. Exit “Edit Mode” to view the page.

You should now see the same list of files that you saw in the SBT SDK sample on your Portal page.  It may not be pretty, but it is functional.  Try copying other examples from the SDK sample application to Script Portlets.

Get My Files Portal

Some may be wondering why I needed to copy the Javascript located at http://<portal>:<port>/sbt.sample.web/library into the Script Portlet.  This is required to:

  • Ensure the endpoint smartcloudOA2 is available
  • Set the AMD paths to the SDK’s modules.  If this were not done, we’d see the SDK trying to load modules from a Portal context.  Rather, they must be loaded from the SDK enterprise application.

Copying this directly into the Script Portlet isn’t ideal.  What if we have two Script Portlets – do we copy the code into both?  One solution is to include this requirement as an external Javascript resource by adding the http://<portal>:<port>/sbt.sample.web/library URL to the list of dependencies.

SBT Dependency
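Conceptually, the library script has to establish an AMD path so that any sbt/* module resolves against the SDK’s context root rather than Portal’s.  A toy sketch of that mapping follows – the /js/sdk/sbt path segment is an assumption for illustration, and the real value comes from the generated script at /sbt.sample.web/library:

```javascript
// Toy sketch: map the "sbt" AMD module prefix to the SDK enterprise
// application.  The path segment is illustrative, not verified; inspect
// the generated /sbt.sample.web/library response for the real mapping.
function buildSbtAmdPaths(sdkContextRoot) {
    return {
        sbt: sdkContextRoot + "/js/sdk/sbt"
    };
}
```

With a mapping like this in place, a require(["sbt/connections/FileService"]) call resolves against the SDK application instead of a (nonexistent) Portal resource.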

Another solution I’ve found is to use Portal’s page profiles to specify when this bit of code needs to load.  Thus, when a Script Portlet containing the SBT SDK exists on the page, I change the page’s profile so that it loads this required Javascript.  I’ll save that discussion for a more advanced post.



Building Social Applications using Connections Cloud and WebSphere Portal

In this series, I’ll explore creating social applications using WebSphere Portal and Connections Cloud.  Specifically, I’ll focus on leveraging the following IBM products and resources:

  • WebSphere Portal (or IBM Web Content Manager)
  • Connections Cloud
  • the Social Business Toolkit SDK
  • WebSphere Portal’s new Script Portlet

To be clear, there are various ways to create a social application in Portal.  If you’d like an IBM off-the-shelf solution, review the Redbook Building and Implementing a Social Portal.  But if you’d like to see an alternative approach, follow along:

  1. Downloading and Building the Social Business Toolkit SDK
  2. Getting Started
  3. Your First Social Portlet
  4. Social Portal Pages
  5. Single Sign-On with SAML
  6. Chat-as-a-Service (Coming Soon)

Building Social Applications using Connections Cloud and WebSphere Portal: Getting Started

Part of the series Building Social Applications using Connections Cloud and WebSphere Portal.

Installing the Script Portlet

We’ll be using WebSphere Portal’s Script Portlet.  The reason is that the Script Portlet is ideal for small, web-centric applications.  Many of the Social Business Toolkit’s examples are exactly that – small, web-centric apps.  So it will be easy to use the Script Portlet and Toolkit as a starter for social applications.  Also, by using the Script Portlet, the technical barrier to creating these applications is lower – assuming you don’t live and breathe J2EE portlet development.

If you do not already have the Script Portlet installed, see the documentation.  The high-level steps are:

  1. Download the Script Portlet from the Greenhouse Catalog.
  2. Unpack the downloaded zip and move the scriptportlet-app-1.3.0.paa file to the Portal server.
  3. Run
    ./ConfigEngine.sh install-paa -DPAALocation=/path/scriptportlet-app-1.3.0.paa -DWasPassword=password -DPortalAdminPwd=password
  4. Run
    ./ConfigEngine.sh deploy-paa -DappName=scriptportlet-app -DWasPassword=password -DPortalAdminPwd=password
  5. Restart Portal.

Configuring Connections Cloud

The Toolkit will use OAuth to communicate with Connections Cloud and retrieve data.  To do that, you’ll need to add an “Internal App” in Connections Cloud.

  1. Log in to Connections Cloud as an administrator or app developer.
  2. Click Internal Apps -> Register App.
  3. Provide a name and select the OAuth 2.0 radio button.
  4. Set the callback URL to the server where you installed the Toolkit.
  5. Click Register.
  6. Back on the Internal Apps page, select the drop down for the app you created.
  7. Click “Show Credentials” and click the “Show Client Secret” link.
  8. Leave this screen open; the details will be used next.

OAuth2.0 Settings

Configuring the Social Business Toolkit SDK

Previously, we installed the SBT SDK.  Now it must be configured to work with the Internal App we created.

  1. Using a text editor, create the SDK properties file.
  2. Copy the following into the file.  You will need to update the section “SmartCloud OAuth 2.0 Endpoint Parameters” with settings from the “Show Credentials” screen on the “Internal Apps” page.
    1. # IBM Social Business Toolkit Configuration
      # Library Servlet Configuration
      # SmartCloud OAuth 2.0 Endpoint Parameters
  3. Since this tutorial focuses on Connections Cloud and OAuth, we’ll remove the SDK’s default environments and additional authentication options for clarity.
    1. Create a file called managed-beans.xml with any text editor.
    2. Copy the following into the file.
      <?xml version="1.0"?>
      <!-- Credential store physical implementation -->
      <!-- Default Environment -->
      <!-- SmartCloud OAuth 2.0 -->
      <!-- Endpoint URL -->
      <!-- OAuth parameters -->
      <!-- Trust the connection -->
      <!-- Access to the credential store -->
    3. Save.
  4. Copy the properties file and the managed-beans.xml file to the server.  For example, /opt/IBM/WebSphere/wp_profile/sbt.  (I created the sbt directory.)
  5. Create two new JNDI URLs in WebSphere.
    1. Access the WAS admin console.
    2. Navigate to Resources -> URL -> URLs.
    3. Select the scope to be your cell.
    4. Select New to create a URL with the following properties:
      1. Name=SBT Properties
      2. JNDI name=url/ibmsbt-sbtproperties
      3. Specification=file:///opt/IBM/WebSphere/wp_profile/sbt/
    5. Select New to create another URL with the following properties:
      1. Name=SBT Managed Beans
      2. JNDI name=url/ibmsbt-managedbeansxmlpath
      3. Specification=file:///opt/IBM/WebSphere/wp_profile/sbt/managed-beans.xml
    6. Save.
  6. Restart the “Social Business Toolkit Sample Enterprise Application” under Enterprise Applications in WAS.
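For orientation, the OAuth section of the properties file ends up looking something like the fragment below.  The key names and host here are illustrative – take the real key names from the sample properties file shipped with the SDK, and the key/secret values from the “Show Credentials” screen:

```properties
# SmartCloud OAuth 2.0 Endpoint Parameters (illustrative keys and values)
smartcloud.url=https://apps.collabserv.com
smartcloud.consumerKey=<client id from Show Credentials>
smartcloud.consumerSecret=<client secret from Show Credentials>
```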

Test the Social Business Toolkit SDK

Time to test whether the above steps worked.  To do that:

  1. Go to http://<portal>:<port>/sbt.sample.web/javascript.jsp in a web browser.
  2. Select the Authentication -> Authentication Summary sample.
  3. Click the “Login” button.
  4. You should see a Connections Cloud login screen.
  5. Provide valid credentials and click “Log In”.
  6. The previous “Login” button should change to “Logout”.

OAuth Authenticated

What’s Next

Next, we’ll create a simple application with the Script Portlet and the Social Business Toolkit SDK examples.

Building Social Applications using Connections Cloud and WebSphere Portal: Your First Social Portlet

Cognos and Portal Single Sign-Off

A previous post created a single sign-on solution between Cognos and Portal.  But what about when the user wants to log off?  Should the user be logged out of both Portal and Cognos?  From discussion with one partner, the answer is, “Yes.”

To create a complete single sign-off solution, we need to:

  • Log off in Portal – clearing the LTPAToken and session cookies
  • Log off in Cognos – clearing the CAM passport

Rather than create a custom solution, we’ll delegate the logoff behavior to the respective applications.  This is done by redirecting the user’s browser to the appropriate logoff URL.  For example, when the user clicks “Logout” in Portal, Portal handles the logout behavior and then redirects the user to the logout URL of Cognos.  And vice versa.

There’s a supported mechanism in both products to do exactly this.  See the redirect.logout settings in Portal and the seemingly non-existent Cognos documentation here or user suggestions here.

Configure Portal

  1. Log into Portal’s WebSphere Console.
  2. Access the WP_ConfigService resource environment provider under Resources -> Resource Environment Providers -> WP_ConfigService -> Custom Properties.
  3. Add the following entries:
    1. redirect.logout=true
    2. redirect.logout.url=
  4. Save.
  5. Restart Portal.

You can locate the Cognos “Log Off” URL by simply inspecting the href of the “Log Off” link in the Cognos user interface.

Configure Cognos

  1. Update the \c10_64\templates\ps\system.xml file.
    1. Set <logoff enabled="true">
    2. Specify a URL in between the <redirect-url> tags.
  2. Run cogconfig.
  3. Add the domain of the redirect URL from step 1 to the IBM Cognos Application Firewall “Valid domains or hosts” setting.
  4. Restart Cognos.

Cognos App Firewall

My system.xml is the following (redirect URL omitted).

 <logoff enabled="true">
 <!-- URL to direct to upon logoff. Note: the 'redirect-url' url specified is subject to validation at runtime -->
 <redirect-url><!-- your Portal logout URL --></redirect-url>
 </logoff>

Your redirect URL will likely be different.  Again, you can find one by logging into Portal and inspecting the logout link.  The link you see is the current page with an encoded POC action telling Portal to logout.  An alternative suggestion is to create a hidden page in Portal, and use this as the designated redirect URL.  In doing so, you might prevent someone from inadvertently deleting the redirect URL’s page and causing the processes to fail.

Cognos 10.2.2 Single Sign-On

This post should be titled Cognos 10.2.2 SSO déjà vu.  Because in my previous post I figured out how to enable single sign-on with WebSphere Portal and Cognos 10.2.1.  Then 10.2.2 came out, and I needed to figure out how to enable SSO … again.

What’s the Same?

The concepts and practices behind WebSphere SSO are exactly the same as in my other article.  You’ll still be exporting an LTPA token, using the same registry across servers, and using fully qualified domain names.  You’ll also be using cogconfig to set up the external identity mapping.

What’s Different?

By default Cognos 10.2.2 uses the WebSphere Liberty Profile.  This removes the need to install the WebSphere Full Profile, but the Liberty Profile lacks an administrative console application.  So you’ll need to manually edit server.xml to configure the LDAP registry as well as set up the LTPA token.

Locate the file \cognos\c10_64\wlp\usr\servers\cognosserver\server.xml.  This defines the application server listening on 9300 and running p2pd.  We’ll simply update it to enable LDAP security.  The important settings are the security <feature> nodes, the ldapRegistry, and the ltpa element.

<?xml version="1.0" encoding="UTF-8"?>
<!--
 Licensed Materials - Property of IBM
 IBM Cognos Products: disp
 (C) Copyright IBM Corp. 2013, 2014
 US Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
-->
<server description="Cognos 10">
 <featureManager>
  <feature>appSecurity-2.0</feature>
  <feature>ldapRegistry-3.0</feature>
 </featureManager>
 <logging consoleLogLevel="ERROR" logDirectory="${install.dir}/logs" messageFileName="p2pd_messages.log"/>
 <application autoStart="true" id="p2pd" location="${install.dir}/webapps/p2pd" name="p2pd" type="war">
  <classloader apiTypeVisibility="spec" privateLibraryRef="p2pd"/>
 </application>
 <httpEndpoint id="defaultHttpEndpoint" httpPort="9300" host="*"/>
 <library id="p2pd" apiTypeVisibility="spec">
  <fileset dir="${install.dir}/bin" includes="jcam_jni.jar"/>
 </library>
 <config monitorConfiguration="false"/>
 <applicationMonitor dropins="${install.dir}/wlpdropins" updateTrigger="mbean" pollingRate="10s"/>
 <webContainer skipMetaInfResourcesProcessing="true" deferServletLoad="false"/>
 <executor coreThreads="100" maxThreads="1500"/>
 <ldapRegistry id="dominoLDAP" realm="defaultWIMFileBasedRealm"
  host="" port="389" ignoreCase="true"
  ldapType="IBM Tivoli Directory Server">
  <!-- userFilter, groupFilter, and mapping attributes copied from your security.xml -->
 </ldapRegistry>
 <ltpa keysFileName="F:\IBM\cognos\c10_64_2\wlp\usr\servers\cognosserver\resources\security\ltpakey.key" keysPassword="{xor}Lz4sLCgwLTs=" expiration="125"/>
 <logging traceSpecification="*=all:SASRas=all"/>
</server>

First, we’re adding the security features using the <feature> nodes.  Then we configure an LDAP registry.  A couple of points on the ldapRegistry:

  • Use your other WebSphere server’s security.xml as a guide.  This will be in the profile’s config folder under the cell name.  For example, the realm value will be defined in the LDAPUserRegistry_1 node of security.xml, as will the userFilter and other filter settings.
  • Notice the userFilter must use an escaped ampersand (&amp;) rather than a bare ampersand.  This is not an encoding mistake on the blog; XML attribute values require it.
  • I could not get SSO working with my federated-security WebSphere Portal server.  I think this is entirely possible, but the system we finally tested SSO on was using a single user registry.  As such, the changes you see above were all that was needed.
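As a concrete (illustrative) example of the escaping, a userFilter copied out of security.xml would be written like this in server.xml – note &amp; where the LDAP filter itself has a bare ampersand.  The filter value shown is invented for illustration; copy the real one from your security.xml:

```xml
<!-- illustrative filter only; take the real filter from your security.xml -->
userFilter="(&amp;(uid=%v)(objectclass=inetOrgPerson))"
```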

Then you specify the LTPA token.  When you restart the server after adding the ldapRegistry, the \cognos\c10_64\wlp\usr\servers\cognosserver\resources\security directory will be created.  Copy into it the LTPA token file you exported from your WebSphere server.  After you’ve done this, add the <ltpa> node with the appropriate settings.  The keysPassword value can also be plaintext while you are testing.

Save, restart Cognos, and test SSO per my other article.

Something else I found is that static resources like images are not contained in the p2pd web application by default.  Said differently, you’ll see a bunch of broken images if you open the Cognos UI directly.  In the past, we would have simply re-built the WAR from cogconfig and elected to include the static resources.  Unfortunately there is no build option in cogconfig for the Liberty Profile.  I ended up simply copying the contents of \cognos\c10_64\webcontent into \cognos\c10_64\webapps\p2pd\servlet.  (You’ll need to create the servlet directory.)  There is likely a supported process to follow, but this worked for testing purposes.

We also did not need to create the security roles as documented previously.


In Cognos 10.2.2 you’re really only configuring the Liberty Profile for SSO.  If you’ve done this before in Liberty, you should have Cognos SSO completed in no time.  If you haven’t, read my previous article as well as the Liberty documentation on LDAP registry and LTPA.

IBM Digital Experience 2015 ISV Summary

IBM just finished its annual Digital Experience conference here in Atlanta.  And as usual, I’m analyzing how announcements and observations from the conference enable independent software vendors to craft better software and compete in market.

Content as a Service

Let’s start with a new announcement, Content as a Service.  For many years, IBM Web Content Manager (WCM) has enabled website content creation and consumption.  And with the rise of mobile, WCM positioned itself along with then-Worklight, now-MobileFirst as a strategic way to extend the website investment to a mobile device.  It was a strategy that not only allowed the website to look good on a smartphone or tablet, but also allowed the same website to use the native capabilities of the device.

Content as a Service (CaaS) is strategically the same idea – extend the investment in web content into mobile.  But instead of simply visiting a website on a smartphone, the content is consumed as discrete pieces of data, likely inside a native application rather than a mobile-friendly web page.  This is done by accessing the data from WCM rather than the final rendered page.  It’s a subtle but novel idea.  Let me explain.

Let’s consider that you’ve recently created an awesome “Where to Dine” iOS app. It’s written in Swift, responsive, beautiful – everything that an app needs to be to stand out. Your tech team has loaded up the database with the trendiest spots to eat out, and you’re well on your way to getting 5 stars. But some of the restaurants close and new ones pop up.  And you find your team struggling with having to go back to developers or the-guy-that-runs-the-database to make changes.  What would be nice is if the scout team could make these changes themselves.

To fix this problem, consider using WCM.  WCM extends self-management and control over content to the scout team by:

  • Providing a framework to create easy-to-use interfaces that foster content creation
  • Maintaining access control on content areas or individual records
  • Enforcing approval processes (i.e. workflow) for content
  • Facilitating rollback and auditing

The native iOS app could access content directly from WCM (i.e. Content as a Service).  Or the current design can continue; a custom workflow simply moves published content from WCM to the pre-existing database.  So WCM really augments the solution rather than replacing anything.
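In code terms, the app stops caring where the records come from; it just maps whatever CaaS delivers into its own model.  A hedged sketch follows – the items/title/elements field names are invented for illustration and are not WCM’s actual REST schema:

```javascript
// Map WCM-delivered content items into the dining app's own model.
// The JSON shape here is hypothetical; consult the WCM documentation
// for the real Content as a Service payload structure.
function extractRestaurants(caasPayload) {
    return caasPayload.items.map(function (item) {
        return {
            name: item.title,
            cuisine: item.elements.cuisine
        };
    });
}
```

The point is that the scout team edits content in WCM, and the app’s mapping layer stays untouched.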

DX on Cloud

In another session a business partner described their Portal-based employee intranet.  Alone an intranet is nothing new, but they had used the “DX on Cloud” offering to create the underlying Portal.  And with DX on Cloud, it took them two days to deploy both Portal and their solution.  For anyone that’s done even a small deployment of Portal, that makes you stop and listen.

You can accommodate rapid deployment of Portal software in other ways: partner-led managed service providers, the PureApp service on SoftLayer, or virtualization.  What makes “DX on Cloud” stand out is that it’s delivered directly from IBM.  And I think we’ll very soon see smaller, economical deployment options [hint].  I’m hopeful that DX on Cloud will give ISVs a single source to build, run, and manage Portal/WCM applications for both enterprise and midmarket.

And for ISVs creating and deploying solutions today, you should be aware of Portal Application Archives (PAAs).  PAAs automate and encapsulate the deployment of your solution.

New Service Oriented Approaches

A service-oriented architecture (SOA) is Portal’s bread and butter.  Years ago, Portal was the “face” of SOA.  But I’m seeing a re-invigoration of the service-oriented approach driven by two enablers: Web Content Manager and IBM Bluemix.

Web Content Manager

The first is Web Content Manager, which historically has not been thought of as an application platform.  But WCM has matured quickly in recent years, and two new capabilities are giving developers additional options.

One option for developers is the Digital Data Connector (DDC).  The premise is simple; use Web Content Manager to provide a web display for data feeds.  These feeds could be ATOM, RSS, XML, JSON, or even custom.  DDC consumes the feed and provides a way for site designers to access details contained in the feed’s data records.  It’s an integration technique that – owing to WCM – is very flexible in both its creation and final presentation.
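A toy version of the placeholder idea makes it concrete.  DDC itself is configured inside WCM rather than coded like this, and the [Placeholder:...] syntax below is invented for illustration – but the substitution it performs is the essence of binding designer HTML to feed records:

```javascript
// Toy illustration of DDC-style binding: substitute placeholder markers
// in designer-authored markup with fields from a parsed feed record.
function bindRecord(template, record) {
    return template.replace(/\[Placeholder:(\w+)\]/g, function (match, key) {
        return record[key] !== undefined ? String(record[key]) : "";
    });
}
```

So a designer writes <td>[Placeholder:id]</td> once, and every record in the feed fills in its own id – no parsing code on the designer’s side.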

Another option is the Script Portlet.  The word “portlet” feels like a misnomer.  It used to be that a Java developer created a portlet and then went to IT to have it installed.  This took time and effort.  With the Script Portlet, what you are really doing is building a web application from your existing website.  The portlet gives anyone the ability to add HTML, CSS, and Javascript to build a web app.  Developers might find the comparison to JSFiddle useful.  For everyone else, think of a “Contact Us” page that has Google Maps.  A web developer could use the Script Portlet to add a few lines of HTML and Javascript to include a Google Map alongside existing location information.  Furthermore, that same Map widget could be re-used in other parts of the site.  This approach is the alternative to “enterprise” development.  It focuses on the web developer and a means to keep up with the speed of front-end development or line-of-business needs without sacrificing re-usability and customization.


IBM Bluemix

To explain Bluemix in this post would certainly understate its capability and value.  But suffice to say that if I were to launch a new application or startup, I’d begin with Bluemix.  The reason is that Bluemix offers a host of consumable services without the need to install, deploy, and manage.  So when you think about advanced technology like IBM Watson and how it might fit into your application, the answer is Bluemix.  And the framework to integrate and orchestrate Bluemix services into a web application continues to be Portal and Web Content Manager.


The maturity of the underlying Portal platform combined with Web Content Manager’s focus on non-technical development gives ISVs an edge in the market.  Sure, you can still continue to write enterprise [Java] code.  But are you obligated to?  No.  Think about how that not only enables your development team but also how it will enable your customers.  And as IBM continues its march to as-a-Service models for both deployment (DX on Cloud) and consumption (Bluemix), the value proposition of ISV solutions in market just keeps getting better.