Blockchains for Enterprise

Permissioned blockchains are being developed and used in enterprise scenarios, where higher transaction throughput, lower latency, and flexible confidentiality and governance models matter more than tolerating low trust among participants. Public, permissionless blockchain ledgers do not meet these enterprise requirements adequately. Enterprise scenarios typically involve known actors with a high level of mutual trust. This allows trust to be traded for transaction throughput, speed, confidentiality and privacy, which in turn enables alternative solutions via permissioned blockchains.

Hyperledger Fabric (referred to as ‘Fabric’ in the rest of this article) is the leading permissioned blockchain framework, initially developed by IBM. There are other players in this space, including MultiChain, Chain Core and the relatively new entrant, Microsoft’s CoCo Framework (referred to as ‘CoCo’ in the rest of this article). Ethereum’s developers are also working on enabling enterprise use cases: the Enterprise Ethereum Alliance is a community of industry experts developing features and enhancements to Ethereum for enterprise needs. This article focuses on Hyperledger Fabric and the CoCo Framework only.

The projects in the Linux Foundation’s Hyperledger suite are a collection of solutions based on various blockchain technologies. The Hyperledger architecture allows different components to be plugged in to serve various use cases, trust levels and fault-tolerance requirements. The ledgers in this collection are not components of one complete solution, but standalone solutions with differentiated characteristics; you can choose the one applicable to your business scenario and requirements. Fabric is one of those solutions, and it is a complete blockchain solution, including blockchain protocol(s) and smart contract (called Chaincode in the Fabric architecture) execution. The tools in the Hyperledger collection (Cello, Composer and Explorer) do work as components of a complete solution – mostly with Fabric at the moment, as far as I can tell. Fabric is available for production use and is already being used in developing real-world blockchain solutions.

CoCo is a platform created by Microsoft. It is not a standalone blockchain protocol; it is a framework that can be integrated with blockchain protocols such as Ethereum, Quorum, Corda and Hyperledger Sawtooth to deliver complete, enterprise-grade blockchain solutions. The integration with Ethereum is the result of an adapted Ethereum codebase: Ethereum provides smart contract execution, transaction processing and the distributed ledger model, while CoCo provides a secure execution environment, flexible confidentiality models and codified governance. The CoCo Framework was announced in August 2017, along with a whitepaper and a demo of a proof-of-concept integration of CoCo with Ethereum. The Ethereum integration is being hardened, integrations with other blockchain platforms (Quorum, Corda and Sawtooth) are in progress, and the CoCo source code is slated to be open sourced in early 2018.

CoCo secures smart contract code and its execution via Trusted Execution Environments (TEEs). The TEEs can be software-based (Hyper-V VSM) or hardware-based (Intel SGX). Fabric does not include guarantees of execution integrity via TEEs, although another ledger in the collection, Hyperledger Sawtooth, does. Sawtooth’s use of TEEs is limited to ensuring the integrity of its leader-election consensus algorithm. CoCo’s use of TEEs seems more encompassing: it includes execution guarantees and security for the smart contract code itself.

Clearly, Fabric and CoCo take very different architectural approaches, though there are functional similarities between the two frameworks. For example, Channels in Fabric provide the mechanism for flexible confidentiality models that ACLs provide in CoCo. TEEs in CoCo serve the same function as Endorsers in Fabric. Fabric has the concept of Validator nodes; such nodes are not needed in CoCo because of TEEs. Composer is a tool to quickly spin up proof-of-concept implementations with Fabric. Microsoft’s App Builder is a tool (I am currently participating in its private preview) that does the same with Ethereum as well as Fabric. App Builder does not currently work with CoCo, but I would not be surprised to see CoCo support in the future.

The default consensus algorithm in CoCo, Paxos, does not provide Byzantine fault tolerance. On the surface this appears to be a letdown, but Byzantine fault tolerance is made (mostly) redundant by the use of TEEs. There are some considerations for mitigating the compromise of CoCo VNs (Validating Nodes); refer to the CoCo whitepaper for more information on that. Note that the consensus algorithm in CoCo is pluggable, as it is in Fabric. The business logic of smart contracts in CoCo can be written in any programming language supported by the underlying execution engine, which means you can use Solidity to write your smart contracts in CoCo today. In Fabric, you can write smart contracts in Go.

The choice of blockchain framework, ledger and consensus model requires review and analysis of your particular business use case, which translates into throughput, latency, confidentiality, governance and network requirements.


Image Credit: David Stankiewicz, Blockchain Illustration 2, CC BY-SA 4.0

Hyperledger, IBM, Microsoft, CoCo, Ethereum and associated trademarks belong to their respective owners.

Paste Table from Microsoft Excel to Confluence Wiki

1. Create the table in Microsoft Excel.


2. Open this page in a browser: http://excel2jira.bluurgh.com/

3. Paste the table from Microsoft Excel into the input box on the page.


4. Click the “Convert Me Now” button.

5. Place the cursor at the intended insertion point in the wiki document.

6. Click the “Insert more content” (+) button on the wiki toolbar.


7. Click “Markup”.


8. Paste the converted markup from step 4 in the left pane and verify the preview.


9. Click on “Insert”

10. Review and verify the inserted content and make corrections if needed.
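For reference, the converter produces Confluence wiki table markup, in which double pipes (||) delimit header cells and single pipes (|) delimit body cells. A pasted two-column Excel range (hypothetical data, not from my screenshots) comes out along these lines:

```
||Region||Q1 Sales||Q2 Sales||
|East|100|120|
|West|90|110|
```

Seeing the expected shape of the markup makes it easy to spot-check the preview in step 8 before you insert it.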


Ethereum Blockchain-As-A-Service in Azure Cloud

Ethereum Blockchain as a Service (EBaaS), provided by Microsoft Azure and ConsenSys, allows enterprise customers and partners to play, learn, and fail fast at low cost in a ready-made dev/test/production environment. It lets them create private, public and consortium-based blockchain environments in Azure very quickly. In this session, you will learn how to get started with prototyping the building blocks of a decentralized application using EBaaS in Azure.

Venue: Triangle Azure User Group

Slides: download.

Getting Started with Smart Contracts on Ethereum Blockchain using Visual Studio

Smart Contracts are an exciting innovation built on blockchain technology. They are a way to execute code in a trustless, decentralized and transparent system. Ethereum is a decentralized platform that runs smart contracts using a variation of Bitcoin’s blockchain technology.

In this session, you will learn how to write smart contracts in Visual Studio, and how to deploy them to the public Ethereum blockchain as well as to a private/consortium blockchain as a service in Azure. You will be introduced to Solidity, the programming language used to write smart contracts, and you will get familiar with the tools and technology around this exciting, promising and relatively new innovation.
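To give a taste of Solidity, here is a minimal, illustrative contract (not taken from the session materials) written in the constructor style of the Solidity 0.4.x releases current at the time. It stores a single number that anyone can read but only the deploying account can update:

```solidity
pragma solidity ^0.4.18;

// Illustrative example: stores one unsigned integer;
// only the owner (the deploying account) may change it.
contract SimpleStorage {
    address public owner;
    uint public value;

    // Constructor: runs once at deployment, records the deployer.
    function SimpleStorage() public {
        owner = msg.sender;
    }

    // Update the stored value; reverts if the caller is not the owner.
    function set(uint newValue) public {
        require(msg.sender == owner);
        value = newValue;
    }
}
```

The public state variables get free getter functions, so reading value costs nothing, while set is a transaction that must be mined.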

Presentation Slides are here: download.

KeyNode with Node.js and Microsoft Azure

KeyNode is an application to issue and verify software license keys. The technology stack for KeyNode is Node.js, MongoDB and Microsoft Azure.

I originally built this functionality with C9.io (a cloud-based IDE with a built-in source code repository and debugger), MongoHQ (MongoDB as a service, now part of compose.io) and AppFog (a cloud PaaS built on top of CloudFoundry). It used SMTP/Gmail to email license files. That was the version I created a couple of years ago to issue tamper-proof signed XML license files for CodeDemo (a code snippet tool for developers, presenters and instructors).

For KeyNode (open source) I switched to a different toolset: Visual Studio Code and Azure. I simplified the code to remove the signed XML files and open-sourced it on GitHub. Signed XML allowed offline verification in CodeDemo (a WPF desktop app); removing it requires verification to happen online, and I am working on adding the web endpoint for verification of license keys. This version uses SendGrid to email license keys. KeyNode is deployed as an Azure Web App, with Continuous Deployment from the source code repository on GitHub.

I created and tested this Node.js application locally without IIS, and deployed it as an Azure Web App without making any changes to the code at all. Node.js applications in Azure are hosted under IIS with iisnode, a native IIS module that allows hosting Node.js applications in IIS on Windows. Read more about iisnode here. The iisnode architecture also makes it significantly easier to take advantage of the scalability Azure affords.
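Azure wires up iisnode for you, but if you host under IIS yourself, a minimal web.config looks roughly like the following (assuming the entry point is server.js; adjust to your file name):

```xml
<configuration>
  <system.webServer>
    <!-- Route requests for server.js through the iisnode module -->
    <handlers>
      <add name="iisnode" path="server.js" verb="*" modules="iisnode" />
    </handlers>
    <!-- Rewrite all other URLs to the Node.js entry point -->
    <rewrite>
      <rules>
        <rule name="node-app">
          <match url="/*" />
          <action type="Rewrite" url="server.js" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

This is why no code changes were needed: the app still listens on the port Node hands it, and IIS/iisnode manage the process and forward requests.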

KeyNode is a work in progress. My plan is to use this as the basis for further explorations in the following areas:

  • DevOps, Docker and Microservices (at miniature scale of course!)
  • Create a Web UI with Express (a Node.js web application framework)
  • Integrate with Azure Storage/Queues
  • and more…

I invite you to check out the live site on Azure and fork it for your own experiments: KeyNode on GitHub.


Photo Credit: Piano Keyboard (www.kpmalinowski.pl)

Using PowerShell with Splunk

The Splunk PowerShell Resource Kit is a convenient and very capable wrapper over the Splunk REST API. You can use the PowerShell cmdlets exposed by this resource kit to deploy, check and manage Splunk services, as well as execute Splunk searches. In this post, you will be introduced to the Splunk PowerShell Resource Kit and learn how to use its cmdlets to connect to a Splunk instance and execute searches.

1. First, download the resource kit from GitHub.
2. Installation is very simple: extract the files from the zip archive and double-click install.bat to install the Splunk PowerShell module.
3. Open a Windows PowerShell console from the Windows Start menu.
4. Verify that the Splunk module is installed by executing the Get-Module cmdlet.

Get-Module -ListAvailable Splunk

5. Import the Splunk resource kit cmdlets using the Import-Module cmdlet.

Import-Module -Name Splunk

6. Next, use the Get-Credential and Connect-Splunk cmdlets to connect to Splunk. You need to do this once per session, or whenever you switch to a different Splunk instance.

$credential = Get-Credential
Connect-Splunk -Credential $credential -ComputerName localhost

I have a local Splunk Enterprise instance running on my machine, so I am using localhost as the ComputerName to connect to it. If you have a Splunk Cloud subscription, you can use YourSubscriptionId.splunkcloud.com as the ComputerName to connect to your subscription. Like so:

Connect-Splunk -Credential $credential -ComputerName MySubscription.splunkcloud.com

7. Next, use the Search-Splunk cmdlet to execute searches:

Search-Splunk -Search "Error"

Here is a sample script:

$lastDay = (Get-Date).AddDays(-1).ToString('s')

$searches = @(
    "ERROR",
    "source=""tutorialdata.zip:*"" ERROR",
    "CreditDoesNotMatch",
    "source=""tutorialdata.zip:.\\www3/access.log"" productId=WC-SH-G04"
)

Write-Output $lastDay
foreach($search in $searches)
{
    $qry = $search + " | stats count"
    Write-Output $qry
    Search-Splunk -Search $qry -StartTime $lastDay | Select-Object -ExpandProperty Count
}

The sample script executes multiple Splunk searches and outputs the count of results matching each query. Note that I am using the -StartTime parameter to scope the search to a narrower time window and the "stats count" command to get the count of results. You can get this sample script as a GitHub gist.

You can also use -EndTime and -MaxReturnCount to further constrain the query results, and the Format-List, Format-Table and Format-Wide cmdlets to format the results. You can learn more about the other search parameters, as well as other capabilities, in the resource kit documentation.
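For example, the following combines those parameters (assuming the $lastDay variable from the sample script above and an open Connect-Splunk session) to search the last 24 hours, cap the results at 10 events, and render them as a table:

```powershell
# Constrain the time window and result count, then format as a table.
Search-Splunk -Search "ERROR" `
    -StartTime $lastDay `
    -EndTime (Get-Date).ToString('s') `
    -MaxReturnCount 10 |
    Format-Table -AutoSize
```

Capping the return count is a cheap way to keep interactive experiments fast before you commit to a broader query.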

Resources:

  1. Splunk PowerShell Resource Kit
  2. Splunk PowerShell Resource Kit on GitHub
  3. Splunk PowerShell Resource Kit Documentation

Image credit: terminal by Andrea Mazzini from the Noun Project