
System Architecture and Scalability Analysis (Application, Server, Database, Operating System & Storage)

Why does an application component meet its "reasonable" response time goal while its tables are small, yet degrade in "exponential" fashion as the table sizes grow? You expect "linear" degradation, but the results are surprisingly otherwise. Ring a bell?

The core problem here is that the application and/or its "environment" has inherent scalability issues: what works well with small data sets falls apart completely with large ones. We pride ourselves on our ability to pinpoint the scalability issues in such components of your system.

In Computer Science, the scalability of a code module is usually measured by its "algorithmic complexity". In simple terms, algorithmic complexity is a mathematical measure of how well the code will perform as its input grows. It is directly related to the abstract data types and data structures the application uses. For example, a linked list has an algorithmic complexity of O(n): if the list is comprised of "n nodes", the algorithm manipulating it may require up to "n node traversals" to reach the data being sought. By comparison, a balanced binary tree's algorithmic complexity is O(log n). Given the same number of nodes, a balanced binary tree provides much more scalable performance than a linked list.
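As a minimal sketch of the O(n) versus O(log n) gap described above (the `Node` class, function names, and element count are illustrative, not taken from any particular system), the following Python snippet counts the steps each approach needs to find the last of n values:

```python
class Node:
    """A singly linked list node."""
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def linked_list_search(head, target):
    """Return how many node traversals are needed to reach target: O(n)."""
    steps, node = 0, head
    while node is not None:
        steps += 1
        if node.value == target:
            break
        node = node.next
    return steps

def binary_search_steps(sorted_values, target):
    """Return how many comparisons a binary search performs: O(log n).

    A balanced binary search tree behaves the same way: each comparison
    discards half of the remaining nodes.
    """
    lo, hi, steps = 0, len(sorted_values) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_values[mid] == target:
            break
        elif sorted_values[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

n = 100_000
values = list(range(n))
head = None
for v in reversed(values):          # build the list 0 -> 1 -> ... -> n-1
    head = Node(v, head)

print(linked_list_search(head, n - 1))     # 100000 traversals (worst case)
print(binary_search_steps(values, n - 1))  # about 17 comparisons
```

At a hundred thousand elements the linked list needs a hundred thousand traversals in the worst case, while the logarithmic search needs roughly seventeen comparisons; that gap is exactly the "exponential-feeling" degradation users report as tables grow.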

Scalability problems can (and will) stem from poor application design, inappropriate data structures, inefficient SQL, system resource limits, database and schema architecture constraints, and improper storage configuration. Early detection, diagnosis, and resolution of scalability problems are key to stable system performance in the long run.
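To make the "inefficient SQL and schema constraints" point concrete, here is a small hypothetical sketch using Python's built-in sqlite3 module (the table name, columns, and row counts are invented for illustration). SQLite's EXPLAIN QUERY PLAN shows the same query flip from a full table scan to a B-tree index search once a supporting index exists:

```python
import sqlite3

# Hypothetical schema: 10,000 orders spread across 1,000 customers.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 1.5) for i in range(10_000)],
)

query = "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer_id = ?"

# Without a supporting index, SQLite must scan every row: O(n) per lookup.
plan_no_index = conn.execute(query, (42,)).fetchone()[3]
print(plan_no_index)    # e.g. "SCAN orders" (wording varies by SQLite version)

# A B-tree index turns the same query into an O(log n) search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_with_index = conn.execute(query, (42,)).fetchone()[3]
print(plan_with_index)  # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The query text never changes; only the schema does. This is why a component can look healthy in testing against small tables and then collapse in production once the scan cost catches up.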
