wissel.net

Usability - Productivity - Business - The web - Singapore & Twins

Postman and the Salesforce REST API


The Salesforce API is a great way to access Salesforce data and can be used with tools like SoqlXplore or the Salesforce Workbench. The API uses OAuth and Bearer authentication, so some steps are required to make that work in Postman.

Prepare Salesforce

You will need a connected app. I usually create one that is pre-approved for my user profile(s), so I don't need to bother with the approval steps in Postman. However, you could opt for self-approval and access the app once to approve its use, before you continue with the command line. Note down the ClientId and ClientSecret values.

Prepare Postman

Postman has great built-in support for all sorts of authorization interactively. However, my goal here is to fully automate it, so you can run a test suite without manual intervention. First stop is the creation of an environment. You can have multiple environments to cater to different Salesforce instances.

Important: Never ever ever store the environment in version control. It would contain credentials -> bad, bad idea!

My environment variables look like this:

{
    "CLIENT_ID" : "the ClientId from Salesforce",
    "CLIENT_SECRET" : "The ClientSecret from Salesforce",
    "USER_ID" : "some@email.com",
    "PASSWORD" : "DontTell",
    "LOGIN_URL" : "https://login.salesforce.com/"
}

Providing the login URL allows you to reuse Postman collections between sandboxes, developer orgs or production orgs without the need to actually edit the Postman entries. Next on the menu: getting a token - a rough sketch of which follows below.
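
Not part of the original post, but as an illustration of how these environment variables could come together: a minimal sketch of a Postman pre-request script that fetches a token via the username-password flow. The ACCESS_TOKEN variable name is my own choice, not something the article defines.

// Hypothetical pre-request script: obtain a token with the username-password
// flow and store it in an ACCESS_TOKEN environment variable (name made up here)
pm.sendRequest({
    url: pm.environment.get('LOGIN_URL') + 'services/oauth2/token',
    method: 'POST',
    header: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: {
        mode: 'urlencoded',
        urlencoded: [
            { key: 'grant_type', value: 'password' },
            { key: 'client_id', value: pm.environment.get('CLIENT_ID') },
            { key: 'client_secret', value: pm.environment.get('CLIENT_SECRET') },
            { key: 'username', value: pm.environment.get('USER_ID') },
            { key: 'password', value: pm.environment.get('PASSWORD') }
        ]
    }
}, function (err, res) {
    if (!err) {
        pm.environment.set('ACCESS_TOKEN', res.json().access_token);
    }
});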


Read more

Posted by on 2018-07-06 05:50 | Comments (1) | categories: Salesforce Software WebDevelopment

Mime is where Legacy Systems go to die


Your new system went live. Migration of current, active data went well. A decision was made not to move historic data and keep the old system around in "read-only" mode, just in case some information needs to be looked up. Over time your zoo of legacy systems grows. I'll outline a way to put them to rest.

The challenges

In all recent systems (that is, anything younger than 30 years) data is stored more or less normalized. A business document, like a contract, is split over multiple tables like customer, address, header, line items, item details, product etc.

Dumping this data as-is (CSV rules supreme here) only creates a data graveyard instead of the much-coveted data lake or data warehouse.

The issue gets aggravated by the prevalence of magic numbers and abbreviations that are only resolved inside the legacy system. So looking at one piece of data tells you squid. Only an old hand would be able to make sense of Status 82 or Flags x7D3z.

Access to meaningful information is confined to the user interface of the legacy application. It provides search and the assembly of business-relevant context.

The solution approach

Solving this puzzle requires a three-step approach (sketched below):

  • denormalize
  • transform
  • make accessible
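
To make the first two steps a little more tangible, here is a sketch of my own (not from the post): one contract is pulled out of its normalized tables into a single self-describing document, and a magic status number is resolved while the knowledge is still around. All field names and the lookup table are made up for illustration.

// Hypothetical denormalize + transform step for one business document
const STATUS = { 82: 'Cancelled by customer' };   // made-up lookup for a magic number

function denormalizeContract(contract, customer, lineItems) {
  return {
    contractNumber: contract.number,
    status: STATUS[contract.status] || 'Unknown (' + contract.status + ')',
    customer: { name: customer.name, address: customer.address },
    lineItems: lineItems.map(function (item) {
      return {
        product: item.productName,
        quantity: item.quantity,
        price: item.price
      };
    })
  };
}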

Read more

Posted by on 2018-06-22 01:40 | Comments (1) | categories: Software Technology

Adventures in TDD


There are two challenges getting into TDD:

  • Why should I test upfront when I know it fails (there's this massive aversion to failure in my part of the world)?
  • Setting up the whole thing.

I made peace with the first challenge using a very large monitor and a split screen, writing code and tests in parallel, deviating from the 'pure teachings' for the comfort of my workflow.

The second part is trickier. There are so many moving parts. This post documents some of the insights.

Testing in the IDE

TDD has the idea that you create your test first and only write code until your test passes. Then you write another failing test and start over writing code.

As a consequence you need to test in your IDE. For JavaScript or Java that's easy (the languages I use most):

  • In JavaScript you define a test script in your package.json that you can run any time (a small example follows this list). For connoisseurs there are tools like WallabyJS or VSCode Mocha Sidebar that run your tests as you type and/or save. The tricky part: which testing libraries (more on that below) to use?
  • In Java, Maven runs tests as part of its default lifecycle and JUnit is the gold standard for tests. For automated continuous IDE testing there is Infinitest
  • For Salesforce, where you have a combination of JavaScript and Apex (and clicks-not-code), testing is a little trickier. The commercial IDEs The Welkin Suite and Illuminated Cloud make that a lot easier. How easy is in the eye of the beholder. (Honorable mention: JetForcer - I simply haven't tested that one yet)
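
To make the JavaScript bullet concrete, here is a minimal sketch of my own (not from the post) of a Mocha test that would run via npm test. It assumes "test": "mocha" in package.json and a hypothetical add module as the code under test.

// test/add.test.js - hypothetical example; src/add.js is made up for this sketch
const assert = require('assert');
const { add } = require('../src/add');

describe('add', function () {
  // Written first, this fails until src/add.js actually implements add()
  it('adds two numbers', function () {
    assert.strictEqual(add(2, 3), 5);
  });
});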

Testing in your Continuous Integration

Automated testing after a commit to GitHub, GitLab or Bitbucket happens once you configure a pipeline as a hook into the repository and specify tests the pipeline can pick up. Luckily your Maven and npm scripts will most likely work as a starting point.

The bigger challenge is the orchestration of various services like static testing, dependency management and reporting (and good luck if your infra guys claim they could set up and run everything in-house).

Some of the selections available:


Read more

Posted by on 2018-06-10 01:43 | Comments (1) | categories: JavaScript Salesforce TDD

What really happens in OAuth


OAuth in its various versions is the gold standard for authorization (and, using OpenID Connect, for authentication as well). There are plenty of introductions around explaining OAuth. My favorite HTTP tool Postman makes it really simple to obtain access via OAuth.

Nevertheless, all those explanations are quite high level, so I wondered what happens on the wire for the getToken part and started digging. This is what I found. Nota bene: there is no inherent security in OAuth if you don't use HTTPS.

The components

  • Authorization server: server to interact with to get an authorization
  • Client identifier (ClientID): the "userid" of the application
  • Client Secret: the "password" of the application
  • A user

I'm not looking at the Resource Server here - it only comes into play before or after the actual token process.

The Form-Post Flow

There are several flows available to pick from. I'm looking at the Form-Post flow, where user credentials are passed to the authorization server to obtain access and refresh tokens.

For this flow we need to post an HTTP form to the authorization server. The post has two parts: header and body. A request looks like this:

POST /yourOAuthEndPoint HTTP/1.1
Host: authserver.acme.com
Accept-Encoding: gzip, deflate
Accept: */*
Authorization: Basic Y2xpZW50aWQ6Y2xpZW50c2VjcmV0
Content-Type: application/x-www-form-urlencoded
Cache-Control: no-cache

grant_type=password
  &username=user%40email.com
  &password=password
  &scope=openid+email+profile
  &client_id=clientid

Some remarks (a small sketch of assembling this request in code follows the list):
- The Authorization header is just a Base64 version of clientid:clientsecret - you have to replace it with your actual info
- Content-Type must be application/x-www-form-urlencoded
- The body is just one line with no spaces; I split it here for readability
- scope is a URL-encoded list; the + signs are actually spaces. Keeping that in mind, you want to keep the server-side scope names simple
- The clientid appears twice: once in the Authorization header and once as the client_id value in the body
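
As promised, here is a small Node.js sketch of my own (not from the post) that assembles the same form post. It assumes a runtime with a global fetch (or node-fetch); host, endpoint and credentials are the placeholder values from the example above.

// Hypothetical sketch of the same token request in Node.js
const clientId = 'clientid';
const clientSecret = 'clientsecret';

// The Authorization header is just Base64 of "clientid:clientsecret"
const basicAuth = Buffer.from(clientId + ':' + clientSecret).toString('base64');

// URLSearchParams takes care of the form encoding, including the + for spaces in scope
const body = new URLSearchParams({
  grant_type: 'password',
  username: 'user@email.com',
  password: 'password',
  scope: 'openid email profile',
  client_id: clientId
});

fetch('https://authserver.acme.com/yourOAuthEndPoint', {
  method: 'POST',
  headers: {
    Authorization: 'Basic ' + basicAuth,
    'Content-Type': 'application/x-www-form-urlencoded'
  },
  body: body.toString()
})
  .then(function (res) { return res.json(); })
  .then(function (json) { console.log(json.access_token); });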

As a result you get back a JSON structure with authorization information. It can look like this:

{
    "access_token": "wildStringForAccess",
    "refresh_token": "wildStringForRefreshingAccess",
    "token_type": "Bearer",
    "expires_in": 300
}

The result is easy to understand:
- expires_in: Duration for the access token in seconds
- token_type: Bearer denotes that you call your resource server with a header value of Authorization: Bearer wildStringForAccess (as shown in the small example below)
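
For illustration only (not from the post), a hypothetical follow-up call to a made-up resource server URL, again assuming a runtime with a global fetch:

// Use the access token from the response above as a Bearer header
fetch('https://resource.acme.com/api/record/1', {
  headers: { Authorization: 'Bearer wildStringForAccess' }
});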

As usual YMMV


Posted by on 2018-06-04 03:16 | Comments (0) | categories: Software WebDevelopment

Reuse a 3rd Party Json Web Token (JWT) for Salesforce authentication


The scenario

You run an app in your domain (it could be a native mobile app, an SPA, a PWA or just an application with JavaScript logic) that needs to incorporate data from your Salesforce instance or one of your Salesforce communities.

Users have authenticated with your website and the app is using a JWT Bearer Token to establish identity. You don't want to bother users with an additional authentication.

What you need

Salesforce has very specific requirements for how a JWT must be formed to qualify for authentication. For example, the token can only be valid for 5 minutes. It is very unlikely that your existing token matches these requirements.

Therefore you will need to extract the user identity from the existing token, while checking that it isn't spoofed, and create a new token that you present to Salesforce to obtain the session token (a sketch of that step follows the list below). So you need:

  1. The key that can be used to verify the existing token. This could be a simple string used for a symmetric signature, or an X.509 public key
  2. A private key for Salesforce to sign a new JWT (see below)
  3. A configured Connected App in Salesforce where you upload the full certificate and obtain the Consumer Key
  4. Some place to run the code, like Heroku
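
As a rough sketch of that verify-and-reissue step - my own illustration, not code from the post - using the npm jsonwebtoken package. The claim mapping (claims.email || claims.sub) and all variable names are assumptions you would adapt to your own token.

// Hypothetical sketch: verify the 3rd-party JWT, then mint a Salesforce-bound assertion
const jwt = require('jsonwebtoken');

function makeSalesforceAssertion(incomingToken, incomingKey, sfPrivateKey, consumerKey) {
  // 1. Verify the existing token - throws if the signature or expiry is bad
  const claims = jwt.verify(incomingToken, incomingKey);

  // 2. Mint a short-lived token shaped the way the Salesforce JWT bearer flow expects
  return jwt.sign(
    {
      iss: consumerKey,                      // Consumer Key of the Connected App
      sub: claims.email || claims.sub,       // the Salesforce username - adjust to your token
      aud: 'https://login.salesforce.com'    // or https://test.salesforce.com for sandboxes
    },
    sfPrivateKey,
    { algorithm: 'RS256', expiresIn: '3m' }
  );
}
// The returned assertion is then posted to /services/oauth2/token
// with grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer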

Authentication Flow for 3rd party JWT


Read more

Posted by on 2018-05-03 10:42 | Comments (0) | categories: Heroku Salesforce

Function length and double byte languages


Complexity is a prime enemy of maintainability. So conventional wisdom suggests methods should be around 20 lines, with some evidence suggesting that up to 100+ lines can still be acceptable.

When I review code written by non-native English speakers, especially when their primary language is double-byte based, I find methods in the 500-1000 line range, with some special champions up to 5000 lines. So I wondered what might contribute to these function/method worms.


Read more

Posted by on 2018-04-09 10:18 | Comments (1) | categories: Java JavaScript NodeJS Software

Creative logging with $A.log()


In Lightning applications there are two ways to log: console.log(..) and $A.log(...). This has led to some confusion about what to use.

The official statement: $A.log() will eventually go away, use console.log()

This is a real pity, since $A.log() is quite powerful and closer to what a developer would expect from logging. One reason for its demise: in a production setting $A.log() would output - nothing. There's no official documentation on how to change that, and the $A.logger.subscribe(...) method is neither documented nor guaranteed; it is only mentioned on Stack Exchange. So?

Enjoy it while it lasts

The simple way to activate console output in production is to add a helper function that can be triggered by a button or whatever you find necessary:

$A.logger.subscribe( "INFO", function( level, message, error ) {
                                console.log( message );
                                console.log( error );
                             });

Instead of sending output to the console, which could confuse users seeing all that 'tech' stuff, you could redirect it into a custom component (the following snippet fits into an onInit script):

var target = component.find("loggerlist").getElement();
$A.logger.subscribe( "INFO", function( level, message, error ) {
                               target.innerHTML += "<li>"+message+"</li><li>"+error+"</li>";
                             });

The target element would be <ol aura:id="loggerlist"> so you get a running list.

Across the network

One is not limited to staying on the same machine. With a few lines of code, logging can happen at a remote location as well. The following shows logging using WebSockets. For a production run (e.g. permanent instrumentation) I would make it a little more robust, like keeping the connection open and checking whether it is still there, or sending JSON, but for occasional support this is good enough:

$A.logger.subscribe( "INFO", function( level, message, error ) {
    var wsEndPoint = 'wss://somewebsocket.url/ws/log';
    var connection = new WebSocket(wsEndPoint);
     connection.onopen = function(event) {
        connection.send(message);
        connection.send(error);
        connection.close();
    };
});

I'll show a potential receiving end implementation in a future post.
As I said: enjoy it while it lasts, it might go away soon. YMMV


Posted by on 2018-04-03 02:50 | Comments (0) | categories: Salesforce

Salesforce one year on


A year ago I said Goodbye IBM, Hello Salesforce. A lot has happened in the last 12 months. Salesforce is only my second salaried job; I've been running my own companies and freelancing before.

Coming from IBM, where Resource Actions had efficiently killed employee engagement, Salesforce's Ohana culture was a refreshing difference. It makes such a difference to work with people who are genuinely interested in your success, without exception. In summary:

  • I became a Trailblazer Ranger, completing 30 trails, 206 badges and collecting 169625 points
  • Passed five Salesforce certifications
  • Contributed to customer success in Singapore, Australia and Korea
  • Wrote 25 blog entries (way too little, more are coming)
  • Moved my blog from Domino to git (more on that below)
  • Contributed to open source on GitHub:
    • Maintainer for node-red-contrib-salesforce. The nodes that connect NodeRED to Salesforce, including the support for platform events
    • Excel2XML: Tool that converts XLSX tables into XML, so data can be extracted in command line applications. Main purpose is to make Excel data accessible in build pipelines (e.g. sample values for tests)
    • Spring Boot and Salesforce Canvas: Sample application that turns a Canvas POST into a JWT authentication, so classic multi-page applications can be integrated into Salesforce Canvas
    • Vert.x proxy: Filtering proxy implemented in Apache Vert.x. It allows you to front a web application and filter HTML, JSON etc. based on content and URL
    • SFDC Platform Events: Modules for Apache vert.x to connect to Salesforce. It includes authentication and processing of platform events. This allows for high performance multi-threaded interaction with Salesforce APIs, not limited to platform events
    • Blog Comments: Tool that accepts a JSON-formatted comment structure and creates a Bitbucket file, a commit and a pull request. Allows for a database-free comment engine
    • BlogEngine: The application that powers this blog. It generates static files when commits/merges happen to my master branch on Bitbucket

What a ride, onto year two!


Posted by on 2018-04-01 12:50 | Comments (2) | categories: Salesforce

Boolean to get major overhaul


George Boole didn't seem to understand his five teenage daughters (he didn't have sons, so this is about teenagers, not daughters), otherwise his boolean logic would encompass not only true and false, but also maybe and don't know. Luckily that omission will be addressed now.

Boolean to merge with Ternary

Quick recap: a boolean has the values true (usually 1) and false (usually 0). Ternary has 3 states, typically denoted -1, 0, 1. Not to confuse ternary with qubits, which are true and false at the same time.

To reflect the real world, where nothing is certain, and cater to teenage level developers, the ternary and boolean data types will be merged into a new type: RealBoolean.

Proposals are under way to incorporate RealBoolean into all major programming languages ASAP. RealBoolean will have the values true, undecided and false. While it is up to the programming languages how these values are represented, consensus is that the most likely candidates are -1, 0 and 1.

New hardware

Like specialized mining hardware for crypto, RealBoolean will benefit from purpose-built ternary computers. Early models were already running in 1958. Ternary computing has also arrived in microprocessor architectures. Of course there are doubters.

Transition period

Having multiple data types to express the truth might fit the political desire for alternative facts, but is an unsustainable confusion in programming. Therefore the classic boolean values will become illegal on April 1, 2042.
In the transition period, classic booleans will be duck-typed into RealBoolean whenever the values true, false or 1 are used. For boolean 0 or -1 (as some unfortunate languages use), compilers and runtimes are mandated to issue a warning for the first 5 years, and thereafter a stern warning, before the values finally become illegal.

Enforcement

All version control repositories will be scanned (the NSA does that anyway) and offending code flagged with new issues. Binary code not compiled from a repository will be treated as a virus, blocked and deleted. After the deadline all remaining offending code will be transpiled into COBOL - good luck finding developers to make sense of that code thereafter.


Posted by on 2018-04-01 12:50 | Comments (4) | categories: After hours Software Technology

Authenticate from Salesforce to Heroku with JWT


Heroku PaaS is an easy and flexible way to extend Salesforce functionality.
It's easy to call out to a Heroku REST service built with a language of your choice: Java, JavaScript, Ruby etc.
The usual sticky point between the two platforms is the identity of the caller.

In a lot of cases the only requirement is "a valid Salesforce user", possibly with some additional claims added.
For this scenario a JWT Bearer authentication (RFC 7519) is a good pick.

When you look around, most examples revolve around a JWT being issued, after some means of authentication, by the same system that was authenticated against.
This scenario is different: The Salesforce server issues the JWT token and code on Heroku validates the token as authorization.
The beauty of the approach: no extra calls need to be made to get this working.

What you will need:

  • A valid certificate; in this case self-signed is good enough (it won't be used by a browser)
  • OpenSSL installed on your computer
  • A Heroku account

Preparing the certificate

In Salesforce Setup, open Certificate and Key Management.
Create or pick a key. Note down the name. For this article I will use YourCertNameHere. Open the key and click Download Certificate.
The cert will be downloaded in crt format. We use this file to extract the public key that Heroku will use to verify the signature. To get the key use:

openssl x509 -pubkey -noout -in YourCertNameHere.crt

The string we need is between the BEGIN and END lines, excluding those lines. You can store it in a file or create a Heroku environment variable.
Since it is a public key, you don't need to guard it as fiercely as your private keys.

The Apex code

Before making a call-out to Heroku, you need to compose the signed JWT token. That's done in just a few lines:

public static String getJWTBearer(String subject, String keyName) {
    Auth.JWT jwt = new Auth.JWT();
    jwt.setSub(subject);
    Auth.JWS myJws = new Auth.JWS(jwt, keyName);
    return myJws.getCompactSerialization();
}

You might opt to add additional claims besides the subject when your use case requires that.
The resulting compact serialization gets added to the Authorization header as a Bearer token.

Java code

I'm using the jjwt library, which is available on Maven Central.
It makes it simple to retrieve a claim. An expired token or an invalid signature will throw an exception, so wrap the call in a try/catch.

import java.security.Key;
import java.security.KeyFactory;
import java.security.spec.X509EncodedKeySpec;
import java.util.Base64;

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jwts;

public class JwtVerifier {
    public static Claims getClaims(String key, String token) throws Exception {
        // The public key arrives Base64/PEM encoded, possibly with line breaks,
        // hence the MIME decoder instead of the standard one
        byte[] byteKey = Base64.getMimeDecoder().decode(key);
        X509EncodedKeySpec x509publicKey = new X509EncodedKeySpec(byteKey);
        KeyFactory kf = KeyFactory.getInstance("RSA");
        Key realKey = kf.generatePublic(x509publicKey);
        // Throws if the signature is invalid or the token has expired
        return Jwts.parser().setSigningKey(realKey).parseClaimsJws(token).getBody();
    }
}

The only catch in the code was the need for the MIME decoder instead of the standard decoder for Base64 decoding.
The subject, typically the user, can be retrieved using claims.getSubject().

Next stop, for another blog entry: the NodeJS equivalent.

As usual YMMV!


Posted by on 2018-03-23 04:40 | Comments (0) | categories: Heroku Java Salesforce