BPM stands for Business Process Management. Let’s break this down into its constituent parts: “business process” and “management”. A process is a series of events that targets a certain outcome, so a business process is an activity a business engages in that produces an expected (or sometimes unexpected) outcome. For instance, when a business hires a new person, there is an entire process for that, referred to as “on-boarding”. Different companies scope on-boarding differently: some count the recruitment phase as part of the on-boarding process, while others consider recruitment an entirely separate process that may or may not lead into on-boarding. As we can see, we already have a potential breakdown in the process, or at least a difference of opinion about what the actual process is.
An in-memory database is a database that is loaded fully into memory and managed by a controller process similar to an on-disk database management system. There are many challenges in implementing CRUD in an in-memory database, including transaction handling, data synchronization with the data source, and pushing data updates back to the data source. To start with, many in-memory database products require the application developer to manually load a database into the memory store. This places a large burden on every application that uses the in-memory database: each one must contain code specifically for loading the database, which can amount to a significant amount of coding. Another approach taken by most of these products is to force the application developer to register listeners for changes to data in the in-memory database. The developer is then responsible for figuring out and coding the logic to update the proper tables and rows based on the changes dispatched by the in-memory database. This can be very complex and challenging, depending on the type of information received from the event notifier. The other major shortcoming of many in-memory databases is their lack of transaction management functionality. This is a huge deal for applications that depend heavily on transactions, such as financial and similar applications.
To resolve the first issue, relieving the developer from manually loading the database, an in-memory database can let the developer simply pass a JDBC or ADO.NET connection string. The in-memory database can then discover the schema of the source database by connecting with that string and calling the schema discovery methods that both interfaces provide. This solution is straightforward to implement and can have a huge impact on the popularity of the in-memory database product: by building that logic into the in-memory database itself, application developers are no longer required to write loading code of their own.
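The idea above can be sketched in miniature. Here is a toy illustration in Python, using the stdlib `sqlite3` module as a stand-in for a JDBC/ADO.NET source; the function names and the use of SQLite introspection queries are illustrative assumptions, not any particular product's API.

```python
import sqlite3

def discover_schema(connection_string):
    """Connect to the source database and return {table: [column names]}
    using only the connection string -- no application-specific code."""
    conn = sqlite3.connect(connection_string)
    try:
        tables = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
        schema = {}
        for (table,) in tables:
            # PRAGMA table_info returns one row per column; index 1 is the name
            cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
            schema[table] = [c[1] for c in cols]
        return schema
    finally:
        conn.close()

def load_into_memory(connection_string):
    """Copy the on-disk database into an in-memory store in one step."""
    mem = sqlite3.connect(":memory:")
    disk = sqlite3.connect(connection_string)
    disk.backup(mem)  # bulk copy; the application writes no loading code
    disk.close()
    return mem
```

A JDBC implementation would do the same thing through `DatabaseMetaData.getTables()` and `getColumns()`; the point is that everything needed is derivable from the connection string.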
With loading handled inside the in-memory database product, the second issue, propagating changes in the in-memory database back to the source, becomes much more manageable. Because the in-memory database can now connect and talk to the on-disk database over JDBC or ADO.NET, it can generate the required SQL to update the source directly, without having to notify application listeners to take care of it. This relieves the developer from writing any update-management code.
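A minimal write-behind sketch of that idea, again using `sqlite3` as the stand-in source; the function name and the UPDATE-then-INSERT fallback are illustrative assumptions:

```python
import sqlite3

def write_behind(disk, table, key_col, row):
    """Propagate one changed in-memory row to the on-disk source by
    generating the SQL directly -- no application listeners involved.
    `row` is a dict of column name -> value."""
    cols = [c[1] for c in disk.execute(f"PRAGMA table_info({table})")]
    assignments = ", ".join(f"{c} = ?" for c in cols if c != key_col)
    params = [row[c] for c in cols if c != key_col] + [row[key_col]]
    cur = disk.execute(
        f"UPDATE {table} SET {assignments} WHERE {key_col} = ?", params)
    if cur.rowcount == 0:
        # Row does not exist on disk yet: fall back to an INSERT
        placeholders = ", ".join("?" for _ in cols)
        disk.execute(
            f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})",
            [row[c] for c in cols])
    disk.commit()
```

The schema discovered at load time tells the engine which columns and key to use, so the generated SQL needs nothing from the application.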
The last, but not least, issue to resolve is transaction management of data updates. All commercial on-disk database products include this functionality, typically using a combination of temporary files and log files to hold the updates belonging to a transaction until the application code triggers a commit. The database product then attempts to copy all the updates from the temporary store into the main database file(s). Writes may fail during this copy, leaving the database with a partial transaction. To overcome this, the database server contains logic to roll back partial updates, for example by keeping a snapshot of the original data taken before any updates begin. The same logic can be deployed by in-memory database products: even though the database is fully loaded in memory, nothing prevents the in-memory database server from using the file system to temporarily store the updates within a transaction. Doing so combines the best of both worlds: in-memory data access for optimal performance, plus support for expected database capabilities such as transactions and CRUD in general.
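The snapshot-and-rollback mechanism described above can be shown in miniature with a toy in-memory key-value store (a sketch for illustration only, not any particular product's design):

```python
import copy

class MemStore:
    """Toy in-memory store illustrating transactional updates: a snapshot
    of the data is taken before any updates begin, so a failed or
    abandoned transaction can be rolled back to the original state."""

    def __init__(self):
        self.data = {}
        self._snapshot = None  # pre-transaction copy, None when idle

    def begin(self):
        # Snapshot the original data before any updates are applied
        self._snapshot = copy.deepcopy(self.data)

    def put(self, key, value):
        if self._snapshot is None:
            raise RuntimeError("no transaction in progress")
        self.data[key] = value

    def commit(self):
        # Updates are already in place; discard the snapshot
        self._snapshot = None

    def rollback(self):
        # Restore the pre-transaction state from the snapshot
        self.data = self._snapshot
        self._snapshot = None
```

A production engine would keep the snapshot (or a log of updates) on the file system rather than in memory, as the text notes, but the commit/rollback logic is the same shape.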
Many software development shops have been using Microsoft’s Team Foundation Server for some years now, for purposes such as integrated source control, team project management, QA process integration, and build process integration. Unfortunately, very few teams have been using it for all of these purposes combined. For years we have seen siloed teams of software development, QA, project management, and product management, each with a separate set of tools that seldom collaborate. Even when those tools are made to collaborate, it is usually through data duplication, typically via Excel or CSV extracts. So either the project and product managers have to be really good at data transfer and manipulation, or someone has to spend a lot of time ‘integrating’ these tools. As a result of these silos, neither the teams nor management can get a complete picture of their project initiatives.
A limited number of vendors provide ALM as a completely integrated environment, and one of the leaders in this space is Microsoft’s Team Foundation Server (TFS). Gartner ranked Microsoft TFS highest in Completeness of Vision and Ability to Execute in its 2012 ALM Magic Quadrant.
Let’s focus on the TFS 2012 team project management features in this blog. TFS Team Explorer, an integral part of the Visual Studio IDE, is complemented by a TFS Web Access component intended for project and product managers along with software and QA teams. Yet adoption of TFS by these project stakeholders has been limited, due to the perceptions that:
- TFS is basically a software developer collaboration tool
- Additional licenses of Visual Studio would be needed for stakeholders who wish to use Team Explorer
- It does not have a clean UI for Agile and Scrum development; online SaaS tools are more intuitive and productive
- TFS did not have a Task Board like other SaaS tools for daily stand ups and easier task management
- TFS Installation and configuration is challenging
- There is a significant learning curve for Product and Project Managers
While some of these perceptions may have had justification in earlier versions of TFS, all of these issues have been addressed in the latest version, Team Foundation Server 2012. There are clearly visible reasons for this statement:
- TFS provides an end-to-end integration for all ALM phases: Requirements, Project Management, Source Control, QA Process, Build Process and Reporting. This could make the ALM process highly streamlined and efficient, if implemented correctly.
- TFS Web has been improved significantly for use by Agile/Scrum Project Managers with features like multiple Work Item update, Task Board and a sleek looking dashboard style home page for each Team Project.
- New drag-and-drop features on TFS Web are appealing as well as intuitive.
- Process template and Work Item customization tools that enable modification to the ALM process and its components such as Backlog Items, Tasks, Bugs, etc.
- Scalable and robust Reporting features that feed on data from a dedicated TFS Data Warehouse. This allows creation of cross-project reports for better Program and Portfolio management.
- TFS 2012 fits the needs of all types of teams and environments, from easy-to-implement, out-of-the-box templates for smaller teams to complex, customizable features for large teams looking to optimize their software product development and ALM process.
- TFS 2012 is now being offered as SaaS by some providers.
The next big question for ALM decision makers is: How do I implement and customize these TFS features in such a way that my organization is able to maximize its ROI in software development through process streamlining and team collaboration?
The ALM team at XCentium specializes in the implementation, configuration, and customization of TFS 2012 and is equipped to connect the dots from software development, the QA process, and project and product management right up to the executive management of the organization, in a way that benefits each of these stakeholders. My subsequent blogs will discuss some enhancements that XCentium has made to TFS 2012 that add project budget and cost reporting along with capacity planning for enterprise teams. Our goal is to maximize the ALM benefits of TFS 2012 by customizing the existing features to suit our clients’ needs and to add some highly desired value for team and project financial reporting. Please feel free to contact us if you are thinking of making your ALM and project management process more accountable and productive.
Web services have been around for over a decade, and over the years there have been many advancements in web service technologies and many changes to the protocols they support. In the initial stages of the web services standardization process, the most influential players in the industry, such as Microsoft, IBM, Sun, Oracle, and similar companies worldwide, agreed on the SOAP standard.
One of the fun parts of consulting is getting to work with people who are willing to explain new concepts. XCentium has a relationship with Dr. Craig Miller, who works in the simulation and modeling arena. I recently got a chance to work with him to use a technique called RFM to demonstrate customer segmentation to one of our clients. This is how we used Tableau for that purpose.
The other day one of my bosses asked me, “What is a CPU on a virtual machine?” I inherently knew the answer, but putting that knowledge into non-technical English turned out to be harder than I thought. As a result of the ensuing conversations, I thought I’d do a little write-up.
The root of the question came from a situation where a client had requested a bump in CPU resources for one of their virtual machines. Correction: the client had actually asked for another CPU. The language is important and speaks to the root of the question. The client then looked at their laptop as a basis for evaluating that additional CPU. What they had was a dual-core CPU running in Hyper-Threaded mode, so even though the laptop had only one “CPU”, they were seeing four activity panes in Windows Performance Monitor. In my mind, that laptop has four CPUs; in the client’s mind, it had only one. So where does the difference of opinion come from?
Upon investigation we found that Sitecore uses the ComponentArt Grid to render the UI, and it does not work well with minified scripts and CSS. I am not really sure why, but we fixed it by modifying the MiniMe handler to allow minification only when the context site is not the “shell” site. Here is what the code looks like.
It is not really that difficult to see; the writing is clearly on the wall: responsive is here to stay. The positive consequences beyond just device support are less obvious at first glance…
In Sitecore 6.6.0 (rev. 130214), if you have a link with attributes such as class or target in the <a> tag, those attributes are removed when you use Insert Sitecore Link in the HTML editor to overwrite the link. I have not tested this in other versions, but I have submitted it to Sitecore support and it has been recorded as a bug. Support also pointed me toward a workaround.
MIT Sloan, in collaboration with the software firm SAS, recently published its Spring 2013 research report, From Value to Vision: Reimagining the Possible with Data Analytics. The premise is that some companies are great at analytics while most are mediocre, a theme other top researchers such as Brynjolfsson, Laursen & Thorlund, and Davenport have addressed. The MIT report offers evidence of superior returns for these analytic leaders, along with anecdotes and attributes. My view is that the essential difference between the analytic stars and the rest is that the stars think beyond using technology as a transactional tool and strive to convert transactional data into actionable information that can provide a competitive advantage in the marketplace. They recognize the finite returns of cost-saving and efficiency initiatives.
In this short piece I will identify several specific opportunities that all firms can access by leveraging their transactional databases. The three categories selected are particularly useful to marketing decision-makers.