wissel.net

Usability - Productivity - Business - The web - Singapore & Twins

By Date: December 2010

Simplify Domino installation on Ubuntu


Installing Domino on Linux requires a few extra steps. You need to create a user/group and implement a startup script. The best one can be found on Daniel Nash's site. A good step-by-step instruction has been provided by Danilo Dellaquila in Part 1: Ubuntu installation and Part 2: Domino installation. Danilo went the extra mile and created a deb package for download that takes care of the details and simplifies your chores. Details are in his wiki.
As usual: YMMV

Posted by on 30 December 2010 | Comments (5) | categories: IBM Notes Linux Lotus Notes

Uncle Pitt promised us a robot, but what is this?


Escaped from snowy Europe, my old friend and partner Peter de Waard visited us for XMas in Singapore. To get the boys excited we announced that Pitt would bring them a real robot. In disbelief they looked at the box after ripping it open:
CT Robot assembly kit
Luckily curiosity isn't in short supply when you are 10 years old and the world is your cupcake. After seeing where it might end they started badgering their grandfather to teach them how to solder (luckily he is a retired electrician) and are looking forward to bringing that CT Bot to life. They might even pick up some German along the way.

Posted by on 30 December 2010 | Comments (1) | categories: After hours Twins

Roll your own burncharts


Burn charts are an important communication medium to make the status of a project transparent to its users. Instead of showing the useless "% complete" (useless since the measurement base for 100% is a moving target), burn charts show how much work is left. I have advocated them before. A burn chart also allows you to visualise the impact of a change request. In my sample graphic it is a vertical red bar.
See what is really happening in your project over time!
I've been asked how I created the samples, and I have been suspected of using MS-Excel, Symphony, Paint and Gimp. I used none of them. What is needed is what one already has on a current Domino server: Dojo. I used the dojox.gfx graphic framework. To draw a graph one just needs to call burnChart("divWhereToDraw", widthInPix, heightInPix, DataAsArrayOfArrays, RemainingUnitsOfWork, displayCompletionEstimateYesNo); where data comes in pairs of [UnitsWorked, UnitsAddedByChangeRequests]. Something like var DataAsArrayOfArrays = [[10,0],[20,0],[20,5],[20,30],[20,0]]. It is up to you to give the unit a meaning. The graphic automatically fills the given space to the fullest. If RemainingUnitsOfWork is zero it will hit the lower right corner exactly. I call my routine from this sample script:
function drawMyBurncharts() {
  var series  = [[20, 0], [20, 10], [10, 30], [10, 0], [30, 0], [40, 20], [20, 0]];
  var series2 = [[10, 0], [10, 0], [10, 20], [10, 40], [20, 40], [20, 40], [20, 20]];

  burnChart("chart1", 800, 400, series);
  burnChart("chart2", 200, 100, series);
  burnChart("chart3", 1400, 800, series);
  burnChart("chart4", 1400, 800, series2, 200, true);
}

dojo.addOnLoad(drawMyBurncharts);
The whole function is rather short and is visible in the source view of the example page (that's not an image, so it could be extended into drill downs).
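To make the burn-down arithmetic concrete, here is a small sketch in plain JavaScript (independent of Dojo; the function name computeRemaining and the starting total are my own invention, not part of the chart routine) that turns the [UnitsWorked, UnitsAddedByChangeRequests] pairs into the remaining-work points such a chart plots:

```javascript
// Sketch: compute the remaining-work data points a burn chart plots.
// Each entry in series is a pair [unitsWorked, unitsAddedByChangeRequests].
// Starting from totalUnits, each period subtracts work done and adds
// change-request scope. Names and shape are illustrative only.
function computeRemaining(totalUnits, series) {
  var remaining = totalUnits;
  var points = [remaining];
  for (var i = 0; i < series.length; i++) {
    remaining = remaining - series[i][0] + series[i][1];
    points.push(remaining);
  }
  return points;
}

// Example: 100 units planned, the third period adds a 30 unit change request
var points = computeRemaining(100, [[20, 0], [20, 0], [20, 30], [20, 0]]);
// points: [100, 80, 60, 70, 50]
```

A change request shows up as the value climbing back up instead of down - the vertical red bar in the sample graphic.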

Now someone needs to wrap that into a XPages custom control.

Posted by on 29 December 2010 | Comments (1) | categories: Software

Technical Debt


One of the publications that gives you a good overview of what is happening in the world of software development is SD Times. Their parent company BZ Media conducts physical and virtual events. Today's invitation aroused my interest since it was titled "A Technical Debt Roadmap". From the description:
""Technical Debt" refers to delayed technical work that is incurred when technical shortcuts are taken, usually in pursuit of calendar-driven software schedules.
Technical debt is inherently neither good nor bad: Just like financial debt, some technical debts can serve valuable business purposes. Other technical debts are simply counterproductive. However, just as with the financial kind, it's important to know what you're getting into."
Go sign up - the speaker is Steve McConnell, the author of Code Complete. Having grown up in conservative Germany (yes, I'm Bavarian) the concept of "Schulden haben" (having debt) carried the stigma of failure and loose morals, and I look very carefully at the required interest payments when it comes to debt. A very German engineering trait is that "things need to be done right" rather than "good enough". It is the foundation of successes like Daimler, Audi or BMW. In a nutshell it is the idea that on delivery a piece of work must be debt free. This doesn't translate very well into software, which by its very nature can't be debt free. So it is always a balancing act. Ever increasing complexity and growing interdependencies seem to favour greater amounts of debt: "Delivery is now, the fixpack is later". However there are different kinds of debt, like in the financial world: your mortgage (hopefully) backed by an asset, your consumption loan, your credit card balance and (hopefully not) that open item from the loan shark in the casino. The equivalents in software would be (in no particular order):
  • Hardcode debt: system names, maintenance passwords, URL settings etc. are fixed values in an application. Debt is due if any of these needs to be changed. Typically found in organisations where creation and maintenance of applications are strictly separated or where the development team lacks experience
  • Broken code debt (the classical bug): something doesn't work as expected. Usually gets paid once the bug surfaces and a big enough customer complains
  • Missing feature debt: gets paid in a following version - if there is one
  • Missed expectation debt: sales, marketing or management make promises or announcements that don't get realised in the current code stream. A close relative of the "missing feature debt". Typically that debt is met with product management denial: "How much more do I sell if I implement that?" Denial, since the one who created the expectation already "spent the money", and failure to service this debt will lead to loss of reputation, credibility, trust and ultimately customers and revenue
  • Non-functional debt: software does what it should but misses non-functional requirements like robustness (against failure), resilience (against attack) or integration capabilities. Expensive debt (I would call it the credit card balance of software) since the discovery of resilience flaws often leads to (expensive) emergency actions, and the lack of integration capabilities leads to rewrites or the addition of a lot of middleware. It is also difficult to spot, since more and more software is business funded, where there is little understanding for non-functional traps
  • Bad code debt: the equivalent of a loan shark arrangement. Looks OK at first view, and the software works at delivery (the night in the casino wasn't disrupted by lack of funds). Usually payment is limited to parts of the interest (patch it up, baby). Leads to abandonment of systems (declared bankruptcy). Typically that debt is hidden as long as possible (who wants to admit owing to a loan shark) only to then take everybody "by surprise".
Paying technical debt requires real money. It is paid back with engineering hours someone needs to fund (if you can find the right talent). As with all debt: if you only serve the interest but not the principal, the debt doesn't go away, and at some point the accumulated interest payments will exceed the principal. Something a myopic quarter-to-quarter view might overlook.
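To put a toy number on the interest metaphor (all figures invented for illustration): if unaddressed debt costs a fixed share of its "principal" in rework every release, the interest alone soon adds up to what a proper fix would have cost.

```javascript
// Toy illustration (numbers invented): servicing only the "interest" on
// technical debt. Each release, the unfixed debt costs a fixed fraction
// of the principal in rework hours; the principal itself never shrinks.
function interestPaid(principalHours, interestRate, releases) {
  var paid = 0;
  for (var i = 0; i < releases; i++) {
    paid += principalHours * interestRate; // rework hours per release
  }
  return paid;
}

// 100 hours of debt at 20% rework per release: after 5 releases the
// interest alone equals the principal - and the debt is still there.
var total = interestPaid(100, 0.2, 5);
```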

Posted by on 28 December 2010 | Comments (2) | categories: Software

Unmasking Parasites - Does your website host malicious content?


Every web server, Domino included, serves files from its file system. So once someone gains access to that file system, unwanted content can be deposited there. If your content is served out of files (or you don't watch your js files carefully) it is just a small step to serving malware. The prevalence of FTP and/or weak access security makes this rather easy. So from time to time you want to check whether your server (or any other destination) serves something you would rather not digest. Unmask Parasites does that for you. I added a link on the bottom left, labeled Site Security Check, to check this site.
As usual YMMV

Posted by on 28 December 2010 | Comments (0) | categories: Software

From LotusScript to ServerSide JavaScript - what do you want to see?


I'm polishing my AD103 "Don't Be Afraid of Curly Brackets: JavaScript for IBM LotusScript Developers" presentation. It's packed with information to get you started. Given that a session has limited time, I need to narrow down the live code examples I will show. The overarching topic is "This is what you are used to in LotusScript, this is how you do that in SSJS" (on a conceptual level where a 1:1 mapping is a misfit).
So I wonder what examples would benefit you most? What would you like to see? Keep in mind the focus is on the language, less on the objects (we have a lot of XPages sessions in 2011).

Posted by on 27 December 2010 | Comments (8) | categories: Lotusphere

Administration rules #2, #4, #21


Sriram asked for the rationale behind rules #2, #4 and #21 of the Golden Rules for Domino Administrators. So here we go. These admin rules are designed to protect you from trouble. Breaking them requires that you know what you are doing, which is quite different from thinking you know what you are doing. So unless you have three good reasons (one is not enough, also two is not sufficient, three is the number) stick to them.
  1. Never ever use operating system tools to manipulate (copy, delete, rename or create) Domino databases. This includes using FTP to move databases to other locations!
    When you touch databases with operating system tools you expose yourself to unnecessary risk. You might be logged in as a different user than the Domino task and break file attributes (owner/permissions). When you copy an NSF you actually create a replica - and 2 replicas on one server are forbidden (it definitely screws up DAOS). You also might end up copying a file out of reach. Database deletion/rename is an adminp task so links can get adjusted. When you FTP an NSF you copy the ODS and any problems it contains (besides the file access rights). With replication only documents get transmitted - if that doesn't work you have a problem you need to fix anyway. Furthermore you risk trying operations while the Domino server is running and corrupting the databases in the process. So all risk, no rewards
  1. Always use the server console for unscheduled replications. Never use the client replicator page for that
    The replicator page runs from the user workstation with the user's rights. You will route the replication traffic through the client network. You will screw up replication formulas (if there are any) and you won't discover a connection issue between the 2 servers. You also block your own replicator. All disadvantages, no rewards
  1. Never use backup copies of ID files when users forget their password. Use IDVault and password recovery
    The fact that an admin has access to a backup copy of an ID file including a password invalidates the WHOLE security model, since an admin can decide to impersonate anyone. It is sloppy security, not worthy of the name. Also old copies might have old keys. Handling ID files is staff-intensive. So: all work and risk, no rewards
    Update: As Manfred pointed out in the comments: this includes the risk of permanent data loss when document encryption keys are in that ID.
Of course that's just a very compressed summary. Keep in mind: We must all face the choice between what is right and what is easy
As usual: YMMV

Posted by on 23 December 2010 | Comments (2) | categories: Show-N-Tell Thursday

Golden Rules for Domino Administrators (courtesy of Manfred Meise)


Manfred Meise of MMI Consult has compiled a list of Golden Rules / Administrative Pledges for Notes/Domino administrators. To save English readers the trouble of Google-translated English, I've translated (and amended) them for you:

The trouble-free operation of a Notes/Domino infrastructure requires attention to (and observance of) a few basic rules. Or to put it differently: ignoring or violating these rules is asking for trouble. The list doesn't claim to be complete and will (based on sad experience in projects) be extended in future.

As a responsible Lotus Domino Administrator I pledge the following:

  1. Never ever allow two replicas of a database to be stored on the same server
  2. Never ever use operating system tools to manipulate (copy, delete, rename or create) Domino databases. This includes using FTP to move databases to other locations!
  3. Never restore a Notes database directly to the server where it came from (unless the original is gone) - See also #1
  4. Always use the server console for unscheduled replications. Never use the client replicator page for that
  5. When using the transaction log always use a physically separate disk with sufficient space
  6. When using "archiving" transaction logging, ensure that the backup software does support this
  7. To create databases from templates always use File - Application... - New... Never copy or rename files on the OS level
  8. When creating DIR links always point them to destinations outside the Domino data directory. Never point two dir links to the same destination
  9. Neither install a Notes client on a Domino server, nor run the client part of the Domino server
  10. Set "Use operating system time" (including DST) on all clients and servers
  11. Ensure a proper backup. Replication (even cluster replication) is no replacement for backup
  12. Keep the servers running especially at night, so the house keeping functions can run
  13. Remove replicas from servers that have been offline for extended periods of time (e.g. 6 weeks) and fetch new replicas
  14. Replicate all system databases at least once a day (that includes: ensure the system databases are replicas)
  15. Never grant manager access to servers from other domains
  16. Always ensure system templates are at the highest release of the Domino servers in use. Templates are backward, not forward, compatible
  17. When using OS level virus scanners exclude at least the data directory. In a multi-user (client) install exclude all data directories of all users, at least *.nsf, *.ntf, *.box
  18. Every Domino server will be protected by a Domino aware virus scanner that can check the database content including the attachments
  19. Only run the server tasks that are in use and ensure proper configuration (keep in mind: some tasks like statistics and adminp are not "used" by users but still relevant)
  20. Never ever delete a person document when users leave. Always use the adminp process to ensure complete removal and addition to a deny-access group
  21. Never use backup copies of ID files when users forget their password. Use IDVault and password recovery
  22. Ensure that users' workstations have the current set of drivers (first and foremost video) installed
  23. Fight disk fragmentation by regularly defragmenting Notes and Domino program and data drives
Thank you Manfred!

Posted by on 21 December 2010 | Comments (6) | categories: Show-N-Tell Thursday

Notes / Domino upgrade cheat sheet


Over time I've written a number of entries related to Domino upgrades, Domino administration and helped many customers to succeed in implementing the latest versions. This entry summarizes what works and provides links to the relevant articles and information serving as a convenient entry point. It is linked in the sidebar and I will keep it current.
  1. Make sure you know what the ideal Domino upgrade looks like
  2. You need a plan, a project plan. Head over to MindPlan and get your version of a mindmapping cum project planning Notes composite application
  3. First make yourself knowledgeable by browsing the IBM Fixlist. It is always a good idea to know what is coming
  4. If you plan to jump versions, you should be on the latest patch level of your current version. IBM Fix Central provides the latest patches to get you there
  5. Visit Upgrade Central to get the latest recommendation
  6. Make sure you build a high performance Domino server for best user experience
  7. If that server is a new one, you can minimise your downtime when upgrading
  8. Revisit your Replication and Routing Architecture. It might be just fine, it might need some TLC
  9. If your Domino server runs on Windows you need to take care of disk fragmentation. Use OpenNTF's DominoDefrag or the commercially supported DefragNSF. If you run Notes clients on Windows, fragmentation is an issue there too. DefragNSF has a client module to take care of that
  10. When rolling out your client upgrades you need to rethink your installation location and installation type (I strongly advocate shared install), especially when you want to be Windows 7 compliant. Smart Upgrade can't change location or type, so you want to invest into the Panagenda Marvel Client or the BCC Client Genie for upgrade, roaming and management
  11. Disk fragmentation is an issue on your clients too. The minimum you can do about it is to incorporate a defragmentation run while installing, using the free MyDiskDefrag. Better, look for continuous defragmentation. Both DefragNSF and the Panagenda Marvel Client support continuous client defrag. You can also consider using MyDefragGUI, which provides a screen saver that defrags whenever your machine is idle (Added March 09, 2011)
  12. Revisit your eMail retention policies. Once you understand the difference between Backup and Archival, you might consider an eMail life cycle solution like the iQ.Suite
  13. If you would like to optimise your infrastructure and benchmark it against thousands of other installations, give TrustFactory a call and schedule a Health Check
  14. To improve the quality of web mail (both your use of webmail and the Internet eMail going in and out) consider Geniisoft's iFidelity plug-in for your Domino server
  15. While cleaning out your installation you might want to bring some order to your group names. It saves a lot of administrative time. To switch your user administration into auto-pilot mode consider HASDL FiRM, the Federated identity and Resource Management
  16. Give your users the opportunity to share ideas and deploy IdeJam by Elguji
  17. Last but not least: make your users more productive! Adopt GTD (after you read the book) and deploy the eProductivity template
And never forget why you are on Domino and what happens when others try to move you away (even Accenture struggles - and the Internet never forgets).
Just to be clear: this post mentions several commercial offerings from IBM Business Partners. You need to evaluate for yourself how an investment into these offerings makes sense for you.
As usual YMMV

Posted by on 18 December 2010 | Comments (1) | categories: Show-N-Tell Thursday

Linux Foundation is getting traction in China


Linux is getting traction in China despite the fact that you can buy a (pirated) Windows CD for a dollar a pop. There is Red Flag, a Chinese incarnation of Fedora, and a trend for more large Chinese companies to join the Linux Foundation. In November China Mobile Communications Corporation (CMCC), the world's largest mobile operator, joined the foundation as a Gold Member, followed by Huawei, the makers of network gear and mobile devices (we all love the E5), joining in December. This highlights 2 interesting trends: firstly, Huawei and CMCC are both telecom players - Huawei builds networks, CMCC runs networks. Secondly, China is striving to transform from the world's workbench into the world's R&D centre. Judging from my interaction with my Chinese colleagues I would say: they are on the right trajectory. So is it now time to brush up your Chinese and join the Linux Foundation yourself - and time for IBM to revisit Domino Designer on Linux?

Posted by on 15 December 2010 | Comments (0) | categories: Linux

World of Warcraft - Improvements on family accounting needed.


The gentlemen have discovered WoW and were badgering me to get them a BattleNet account, so they can play - pardon: get access to a strategy and leadership training ground. BattleNet requires a parent to have an account too in order to use the parental controls. So far so good. I got an eMail with the URL to access the pages that allow me to limit what they can do and how long they can play, per day and/or week and when. I can also opt into a weekly play statistic. So that works rather well. Only downside: you have to keep the eMail or bookmark the page, since your BattleNet account won't reflect your parental status and show the URL to you. From there it went downhill.
When I created my account I registered the game serial number in my account. I learned very quickly that that wasn't a good idea. So I contacted the helpdesk through their online form, only to get a reply: Sorry, we only handle Starcraft for SEA, please go ... followed by a link to exactly the form I had just filled in. Lousy service. After another round of forms/eMail I finally got the right eMail to the US helpdesk. The replies to my questions were badly formatted standard answers that just missed the point. I would at least expect cookie-cutter answers to be more appealing. Finally I found out: to transfer the game to one of my sons I have to jump through hoops, including faxing identity documents. I haven't seen a fax for years - and I won't send a private company any of my identity documents. My second question: can I buy a subscription for my sons and pay it with my credit card in my BattleNet account? Turns out that isn't possible. I would have to enter my credit card details in their accounts, where I have no control (I encourage them to have their own passwords) - another fail. To clarify this took 3 rounds. The only alternative is to find a retailer and buy battle cards. Every contact with support was followed by a "rate how we're doing, click here" eMail.
In summary: WoW fails:
  • Online support is confusing
  • Account transfer between family members is overly complicated
  • Parental controls are functional but could be easier to access (and why not have a playtime dashboard instead of a weekly eMail)
  • Family payment option (from a master account) is missing (That also might work for companies as a perk for the Geeks)
Guys you can do better!

Posted by on 12 December 2010 | Comments (1) | categories: After hours

How to setup functional network based internet parental control?


I'm firmly in the camp "Teach kids to swim rather than keep them away from the water". Nevertheless, due to strong suggestions from SWMBO, I'm looking into setting up parental controls. Being a geek dad, my requirements tend to be more complex. First of all we have a zoo of internet-capable devices: PCs (Windows, Linux), Mac, iPhone, PSP, Wii, DS and even an OLPC. The gentlemen switch between all these devices. On the PCs and Macs they do have their own accounts. So parental control needs to be implemented on the router level (I would be OK with dd-wrt or OpenWRT). The general idea is time- and duration-based access. So based on school day / weekend / holiday there would be a from/to window during which the internet is open. There would also be a time limit for total access (e.g. the internet would be open for 12h/day but total time might be 2h only). On top of that there would be a list of constraints and extensions:
  • Obviously there is a permanent black list. I don't want them to surf to adult content
  • There is a permanent white list. Places like The Khan Academy, OpenCourseWare Consortium, the MIT, their own School, The yellow bubble or comparable sites would be available any time (even outside the core internet time) and would not count against the time limit.
  • There is a "gray list" that lists sites (like battle.net) that are only accessible during "off hour" periods. Off hours would be a time frame inside the general opening hours (or their own time slot if that is easier to implement). Gray-listed items could have their own time limits
  • For the devices with user accounts I would expect no authentication prompt but automatic propagation of identity (am I dreaming here?) from Win, Mac, Linux. For the devices without user identity (iPhone, Wii, DS, PSP) an authentication prompt would be required (more dreaming?) - or alternatively: their own Ethernet-ID based access control
  • Special challenge: a lot of sites (like the Khan Academy) store content on YouTube. But YouTube also hosts content like the Annoying Orange. How to allow the former and block the latter?
  • Bonus challenge: have a request mechanism where they can click and request an URL to be added to the permanent white list
What would be a workable approach to implement this?
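As a thought experiment, the rule set above could be sketched as a simple decision function. This is plain JavaScript pseudo-policy, not router code; all names, hosts and numbers below are illustrative assumptions, not a finished design:

```javascript
// Sketch of the access decision described above. Hosts, hours and
// limits are invented placeholders.
var policy = {
  blackList: ["adult.example"],
  whiteList: ["khanacademy.org", "ocwconsortium.org"], // always open, never counted
  grayList:  { "battle.net": { fromHour: 18, toHour: 20 } }, // off-hour slot only
  openFrom: 14, openTo: 21,    // general opening hours
  dailyLimitMinutes: 120       // total time budget per day
};

// host: requested site, hour: current hour (0-23),
// minutesUsed: internet minutes already consumed today
function decide(policy, host, hour, minutesUsed) {
  if (policy.blackList.indexOf(host) >= 0) return "deny";
  if (policy.whiteList.indexOf(host) >= 0) return "allow"; // even outside core time
  var gray = policy.grayList[host];
  if (gray) return (hour >= gray.fromHour && hour < gray.toHour) ? "allow" : "deny";
  if (hour < policy.openFrom || hour >= policy.openTo) return "deny";
  if (minutesUsed >= policy.dailyLimitMinutes) return "deny";
  return "allow";
}
```

The hard parts from the list above - identity propagation per device and telling Khan Academy videos from the Annoying Orange on the same YouTube host - are exactly what this naive host-based sketch cannot express.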
P.S.: Now if only I could control the TV as part of the setup (I'd rather have them surfing or playing WoW than couch-potatoing).

Posted by on 12 December 2010 | Comments (5) | categories: After hours

Rendering your own output in XPages revisited


My blog post Web Agents XPages style proves to be particularly popular, with a lot of questions coming in. There are a few clarifications that I'd like to share:
  • You can have only one: either the Writer (for text) or the Stream (for binary content - sample here). You actually might want to opt for the stream even for textual content, depending on how you create it (see below)
  • You call facesContext.getResponseWriter() in the afterRenderResponse event, while you call facesContext.getOutputStream() in the beforeRenderResponse event (remember: only one of them, please!)
  • It seems you don't need the writer.close(); at the end. Starting with 8.5.2 this actually throws an error
  • A smart way to render output is a Java bean that takes a stream as a parameter. Such a bean can be fully tested and debugged outside the XPages context
  • Very often the purpose of rendering your own content is to output XML or HTML or a combination of both. In LotusScript you simply use Print, so the temptation is rather high to use writer.write("..."). This misses an opportunity to take advantage of code other people have written and debugged. Let a SAX Transformer take care of encoding, name spaces and attribute handling. Use this sample as a starting point (yes, SAX can be used to create a document - use a Stack to control nesting). If you have a complicated result you could use a DOM object to render it, which has the advantage that you can assemble it in a non-linear fashion
  • You can easily add an XSLT transformation to the output. This increases flexibility but costs performance, so it is a good idea to keep the stylesheets precompiled in an application bean.
  • When you need to output XML from 2 different name spaces (e.g. XML and HTML) you can either use SAX's name space feature or add the HTML in text sections. SAX takes care of encoding and adds CDATA statements
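To illustrate why hand-rolled writer.write(...) is the risky path, here is a minimal plain-JavaScript sketch (function names are mine) of the entity escaping a SAX serializer performs for you on every text node and attribute:

```javascript
// Sketch: the character escaping a SAX serializer handles for you.
// Hand-rolled writer.write("<tag>" + value + "</tag>") skips this and
// breaks as soon as the value contains markup characters.
function escapeXml(text) {
  return String(text)
    .replace(/&/g, "&amp;")   // must come first, or it double-escapes
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function element(name, content) {
  return "<" + name + ">" + escapeXml(content) + "</" + name + ">";
}

// element("note", 'Fish & Chips <today>')
// -> "<note>Fish &amp; Chips &lt;today&gt;</note>"
```

And this only covers entities - name spaces, attribute quoting and nesting are further details a SAX Transformer or DOM serializer gets right so you don't have to.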
As usual YMMV (and let me know how it goes). Other posts on rendering your own output in XPages:

Posted by on 09 December 2010 | Comments (0) | categories: XPages

Rethinking Getting Things Done - GGTD


I'm a big fan of David Allen's GTD and its implementation in Lotus Notes, eProductivity, for personal productivity. Staying on top of my commitments and priorities surely helps to keep blood pressure and frustration levels at bay.
Lately I started wondering: what's next? GTD is one of the techniques (like principled negotiation) that work even better when your counterparts use them too. So I thought: what would GGTD look like - Getting Group Things Done?
Musing about that opens an interesting set of questions around collaboration culture. It appears to me that " who can task whom with what" is one of the biggest enterprise taboos once you leave the realm of a call center.
An indicator for the sensitivity of that topic is the low use of group tasks in most organisations. If I remember correctly, the ability to send a task instead of an eMail has been available in Lotus Notes for a decade. However I hardly see it in practice. Even my boss rather sends me an eMail with a request - "I need this and that report by this deadline" - than making it a task (could it be a UI issue?). On the other hand, we do use Lotus Connections Activities quite a bit, including the To Do functionality there (which opens another can of worms: task list fragmentation).
So the question is: what are the "rules of engagement" for requesting more formal actions? Unstructured/informal task assignments happen all the time: "I need... Could you... Please provide ...". This is why one of the most used buttons in my use of eProductivity is "Copy into new Action". The list of questions is rather long (but far from complete):
  • What level of collaboration and trust allows me to create an action for you?
  • Does a task have the stigma of a "command" rather than being a "request"?
  • Do I need a mechanism to "suggest" an action?
  • Would I share my (GTD) project(s) with you or state the classifications I made?
  • What mechanisms are acceptable to signal completion of actions and follow-on actions?
  • Would I give you anytime access to the list of topics I want to clarify with you? How would formal and informal workflows be reflected in GGTD?
  • A decade ago I implemented the first version of EasyOffice that featured contextual workflows. Every correspondence (read: GTD project) had a person "currently in charge" with complete visibility for all project members. It would suggest "typical next actions" based on the project meta data. Would that help in a GGTD context?
  • One cornerstone of GTD is the trust I have in the system. If I participate in a group-driven system, can I extend my trust there? Can I trust the participants and can I trust the technology?
  • What needs to be done to retain that sense of "I am in control" for all participants?
  • Will a shared task system make me liable for all my actions?
Looking at all these questions I see parallels to the introduction of calendaring and scheduling. I still encounter organizations and individuals who refuse to use a scheduling application since they don't want to be "machine driven" (My rule: you want my time, you send an invite. I do not copy meeting dates from eMails to calendar entries). So a lot of smart thinking needs to be done to get this right. And there is a next frontier: EGTD (make your guess what that means).

Posted by on 07 December 2010 | Comments (3) | categories: GTD

The Lotusphere 2011 sessions are published - and what to expect from me


The Lotusphere 2011 Session list has been published on the Lotusphere 2011 web site. Since my sessions in 2010 had been popular (thanks to Tim and Steve) I'll be back with 3 sessions:
  1. SHOW107: The DataSource Session: Take XPages data boldly where no XPages data has been taken before - with Jim Quill
    XPages out of the box ships with 2 Domino data sources. Other data could be read using Java libraries leaving them with all complexities around security, connections and state management. Using the XPages data source component model this complexity can be reduced to simple attribute selections. The session will introduce the details of the data source component model and show how to use it to connect to RDBMS, SOAP, REST, PureXML, CouchDB, oData and others. Leveraging the data source model enables automatic property support from Domino Designer and provides seamless integration with existing data bound XPages controls.
  2. AD103: Don't Be Afraid of Curly Brackets: JavaScript for IBM LotusScript Developers
    XPages uses JavaScript both on the server and the client side. The session offers an introduction to JavaScript for LotusScript developers. Focusing on server side JavaScript, this session highlights the similarities, differences and common pitfalls. Learn how to write JavaScript style, not just LotusScript in a JavaScript wrapper.
  3. AD107: Microsoft Sharepoint Meets IBM Lotus Domino: Development Deep Dive - with Justin Lee (aka triplez82 of hackerspace.sg fame) - That is Team Singapore!
    Lotus Notes and Microsoft Sharepoint aim for similar applications in the enterprise. This session will introduce Sharepoint concepts from a development point of view, compare them with Lotus Domino concepts and discuss integration points. Live code samples for using XPages to read and write from/to Sharepoint highlight the finer points of code integration. Finally, we'll discuss strategies to help stakeholders understand the process and benefits of migrating away from Sharepoint to the Lotus Domino collaborative application platform.
Presentation and demo creation is in full swing. If you have a special wish for the topics, let me know now.
See you there.

Posted by on 07 December 2010 | Comments (1) | categories: Lotusphere

The Business of OpenSource Software


Richard Stallman, the founder of the Free Software Foundation and inventor of the GNU General Public License (also known as copyleft), defined the 4 freedoms of free software (where "free" is usually understood as "free as in speech" rather than "free as in beer") as:
  1. The freedom to run the program, for any purpose
  2. The freedom to study how the program works, and change it to make it do what you wish. Access to the source code is a precondition for this
  3. The freedom to redistribute copies so you can help your neighbour
  4. The freedom to distribute copies of your modified versions to others. By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this
These freedoms sound like pledges made when entering a noble round of knights for the betterment of the world. One could easily conclude that such freedoms have no relevance outside a small group of enthusiasts. However, that is far from reality. Apparently the Linux kernel is to a large extent maintained by paid professional developers. So what's happening there?
A little economic theory goes a long way here: if a market provides high profit margins, it will attract competitors who drive prices down until only marginal profits are earned (also known as Perfect Competition). One set of factors (there are others) that prevents perfect competition is known as Barriers to Entry. For software these are the installed base (the network effect) and the huge investment needed to create new applications (also known as sunk cost). Looking at the profit margins of software companies, it becomes obvious that they earn very attractive margins (the marginal cost of the second license of a developed software is practically zero, not to be confused with the "cost of doing business"), which points to high barriers to entry.
OpenSource lowers the sunk-cost barrier by allowing the market entry cost to be spread over many participants, so it becomes possible to compete. It is also a way to starve competitors of revenue in their key market, making it harder for them to compete in your own markets (history shows: if the war chests are empty, aggression subsides). OpenSource can be attractive for customers as well. I've not come across any larger customer where software didn't get heavily customised. The cycle mostly looks like this:
Sequence of upgrades and customisation for vendor provided Software
A vendor (e.g. SAP, Oracle, IBM or Microsoft) releases a software product and customers engage professional services to adapt the product to their needs. This service can be rendered by the vendor's consultants, by independent system integrators like Accenture, Wipro, TCS etc. or by in-house expertise. Problems or wishes in the core product are fed back to the vendor with the hope (and political pressure) that they will make it into the next version. Corporations pay maintenance but have little influence on the product manager's decision about what comes next. Once the new version comes out, the cycle of applying customisations starts anew.
In an OpenSource model the mechanism (ideally) would look different. Corporations would not pay for software licences but for know-how, implementation and help desk. By letting their staff, or the staff of the chosen implementer (who could be the OpenSource project principal), become contributors, they can wield a much bigger influence over the features and direction of the product:
Sequence of upgrades and customisation for Open Source Software
Money is spent tied to the implementation of specified features. Customisations would flow back into the core product, so once the next release is out they don't need to be reapplied. If a large-scale customer disagrees with the general product direction, they could fund a fork of the project and go their own way. As the Debian/Ubuntu example shows, such a separation doesn't need to be 100%; a fork can still reap benefits from a shared code base. Companies would also gain the freedom to choose whom to ask to fix a bug (or implement a feature) outside the release cycle. This way they can reduce the total bill (part of the profit margin stays in the corporation). The lower licensing cost will probably require higher consulting fees. It would be interesting to "run the numbers" and include the productivity gains from better tailored software. One big OpenSource platform that is driven by customers (and academia) is the OW2 Consortium with an impressive list of infrastructure software. The big wildcard in this scenario are the system integrators. So far I haven't seen them pushing the model "let us provide service, support and customisation for your OpenSource". It could be that on one side they don't want to endanger their supplier relations with the large software houses, and on the other side the idea that a company could simply give back its customisations to the core product (and thus potentially to the competition) seems rather alien to business people. Another reason could be that OpenSource is perceived as "risky". Anyway, the big vendors have understood this threat, hence the fierce drive to move everything into the (proprietary) cloud. We live in interesting times.

Posted by on 06 December 2010 | Comments (1) | categories: Business Software

Loading HTML or XML Content in LotusScript over HTTP


Your application needs data that is stored on a web server. If that data is available through a web service, you are lucky: since R8, web service clients are supported in LotusScript. If you want to load data from a plain URL, you are out of luck. Typically you would resort to ActiveX and use the IE component to do the retrieval, which introduces 3 evils: a Windows dependency, an IE dependency and an ActiveX dependency. The other way is to use Java, which turns a lot of LotusScript developers off. The solution is to use a ready-made library that wraps all the Java you need into a convenient LotusScript class. The use case I had was to read HTML from a remote site and return a specific table for further processing. So my class has an XPath parameter that allows you to slice out some part of the returned HTML. This is how you would use it in LotusScript:
%REM
    Agent UpdateHTMLOnChange
    Created May 28, 2010 by Stephan H Wissel
    Description: Reads all documents that have been flagged
    as changed and retrieves the updated HTML
%END REM

Option Public
Option Declare

Use "HTTPUpdatesLS"

Sub Initialize
    Dim updateClass As HTTPUpdates
    Set updateClass = New HTTPUpdates
    Call updateClass.UpdatePendingDocuments()
    Set updateClass = Nothing
End Sub

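The LotusScript class hides the Java plumbing behind a simple facade. For the curious, the underlying pattern it wraps (fetch a document, parse it, slice out a fragment with an XPath expression) can be sketched with the standard JDK XML APIs alone. This is an illustrative sketch, not the actual library code: the class name XPathSlice and the sample markup are made up, and it parses a string for brevity where the real class would parse the stream obtained from java.net.URL (and would first need to tidy real-world HTML into well-formed XML):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.xml.sax.InputSource;

public class XPathSlice {

    /** Returns the text content of the first node matching the XPath
        expression, or null if nothing matches. */
    public static String slice(String xml, String xpathExpression) throws Exception {
        // In the real class the InputSource would wrap the stream of a
        // java.net.URL; a String stands in here to keep the sketch testable
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        Node hit = (Node) xpath.evaluate(xpathExpression, doc, XPathConstants.NODE);
        return (hit == null) ? null : hit.getTextContent();
    }

    public static void main(String[] args) throws Exception {
        // Slice the first cell out of one specific table, the same way the
        // LotusScript wrapper's XPath parameter narrows the returned HTML
        String page = "<html><body><table id=\"prices\">"
                + "<tr><td>42.50</td></tr></table></body></html>";
        System.out.println(slice(page, "//table[@id='prices']//td"));
        // prints 42.50
    }
}
```

A helper along these lines is what a LotusScript wrapper would typically call through the LS2J bridge, so the agent above never touches Java directly.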
Read more

Posted by on 01 December 2010 | Comments (6) | categories: Show-N-Tell Thursday