Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

SQL 2014 Hosting - ASPHostPortal :: Make Your SSAS Work Like a Private Jet!

December 10, 2015 19:54 by author Jervis

SQL Server Analysis Services (SSAS) Tabular is a popular choice as an analytical engine for many customers. With its state-of-the-art compression algorithms, multi-threaded query processor, and in-memory capabilities, SSAS Tabular can give reporting client applications extremely fast access to data. However, as a consultant, I have been called in by many clients to resolve slow query performance when accessing data from SSAS Tabular models. My experience has taught me that most, if not all, of these performance issues can be resolved by taking care of the following five subject areas.

Estimate Current Size and Growth Carefully

Tabular models compress data really well; on average, you can expect a compression ratio of around 10x (though it can be much higher or lower depending on the cardinality of your data). However, when you are estimating the size of your model and its future growth, a rough figure like this is not good enough. If you already have data sitting in a data warehouse, import a subset of that data (say, a month) to find the model size for that subset, and then extrapolate based on the required number of rows and years as well as the number of columns. If not, try to at least get a subset of the data from the source to measure the model size. Tools like BISM Memory Report and VertiPaq Analyzer can further help in this process.
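To make the extrapolation concrete, here is a worked example with purely illustrative numbers: if one month of data compresses to 2 GB in the model, three years of similar data would come to roughly 2 GB x 36 = 72 GB. Leave headroom on top of that for added columns and growing cardinality, since compression ratios rarely stay constant as the model evolves.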

It is also important to record the number of users who will be accessing the system today, as well as the estimated growth in the number of users.

Select or Upgrade Hardware Appropriately

Everyone knows SSAS Tabular is a memory-intensive application, but one major issue I have seen is that only the RAM is considered when hardware is selected. Focusing on RAM alone is not enough; there are a lot of other variables. Assuming all other variables are constant and the budget is unlimited, these are the recommendations:

CPU Speed – The faster, the better. Higher clock speeds help compute results faster, especially when the bottleneck is the single-threaded formula engine.

CPU Cores – In theory, the more, the better, as more cores help in handling concurrent user loads. In reality, however, an increase in the number of cores usually comes with a decrease in clock speed due to management overhead, so a balanced approach has to be taken when weighing core count against speed. Also, for much software, licensing cost increases with the number of cores.

CPU Sockets – The fewer, the better, as SSAS Tabular is not NUMA aware up to and including SQL Server 2014. This is expected to change in SQL Server 2016, where some NUMA optimizations have been made. For large tabular models, it might be a challenge to stay single socket, as the amount of RAM a system can support depends on the number of CPU sockets.

CPU Cache – The more, the better. Retrieving data from the CPU caches is 10-100x faster than retrieving it from RAM.

CPU Architecture – The newer, the better, due to hardware performance optimizations. For example, an Intel Xeon processor on the Haswell architecture will generally be faster than one on Sandy Bridge, all other variables being constant.

Amount of RAM – You should have at least 2.5x the model size if the model is going to be processed on the same server (see the worked example after this list). The amount of RAM can be lower in certain scale-out architectures where the model is processed on a separate server.

RAM Speed – The faster, the better (yes, RAM has speed too!). This is very important for a memory-bound application like Tabular, so go for the faster speeds if the budget allows.

Storage – Largely unimportant, as it has virtually no effect on query performance once the model is in memory. However, if the budget allows, faster storage like SSDs is not a bad idea, as it helps with maintenance-related activities like backups and restores, and it brings the tabular model back online faster when the service is restarted.

Apart from these, there are other factors like network latency and server architecture (scale-out) that have to be considered, but depending on the budget and specific customer requirements, a balanced approach will have to be taken.
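To make the 2.5x RAM guideline concrete, a worked example with illustrative numbers: a 40 GB model processed on the same server calls for at least 40 GB x 2.5 = 100 GB of RAM. The multiplier exists because, during processing, the old copy of the model stays online to answer queries while the new copy is being built, and queries themselves need working memory on top of that.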

Design the Data Model Properly

Tabular performs really well, and with small models it is extremely forgiving of bad design. However, as the amount of data grows, performance problems begin to show up. In theory, you get the best performance in SSAS Tabular if the entire dataset is flattened into a single table; in reality, that would mean an extremely bad user experience as well as a lengthy and expensive ETL process. So the general best practice is a star schema. It is also recommended to include only the relevant columns from the source tables, since every extra column increases the model size, which in turn slows down queries. An increase in the number of rows is usually fine as long as the cardinality of the columns does not change much.

Depending on specific customer requirements, there can be deviations from these best practices. For example, for one client's very large production model, we built custom aggregate tables alongside the detailed fact table. The resulting measure used a conditional statement to retrieve data from the aggregate table whenever no detail-level dimension data was used in the report. Since the aggregate table was only 1/10 the size of the detailed fact table, queries came back roughly 10x faster whenever the details were not used, which was almost 90% of the time. A minimal DAX sketch of that pattern follows.
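Here is a minimal sketch of that conditional-measure pattern. The table and column names (FactSalesDetail, FactSalesAgg, Product[SKU], 'Date'[Date], SalesAmount) are hypothetical stand-ins, not the client's actual model:

```
Total Sales :=
IF (
    -- Detail-level columns are in play: answer from the detailed fact table
    ISFILTERED ( Product[SKU] ) || ISFILTERED ( 'Date'[Date] ),
    SUM ( FactSalesDetail[SalesAmount] ),
    -- No detail-level filters: answer from the much smaller aggregate table
    SUM ( FactSalesAgg[SalesAmount] )
)
```

The check has to cover every column that exists only at the detail grain; if a report can slice by any of them, the measure must fall back to the detailed fact table to stay correct.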

Optimize the DAX Calculations

With small models, Tabular is extremely forgiving of bad DAX code as well. However, just as with bad design, performance takes a hit as you add data, add users, or run more complex queries. DAX is the hardest item on this list to tune, so it is important to have a strategy for maintaining and tuning performance. A good place to start is the Performance Tuning of Tabular Models in SQL Server 2012 Analysis Services whitepaper. One common class of rewrite that tuning uncovers is sketched below.
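As an example of the kind of rewrite DAX tuning typically uncovers (a sketch with a hypothetical FactSales table): wrapping a whole table in FILTER forces the single-threaded formula engine to iterate it row by row, while a simple column predicate can be pushed down to the multi-threaded storage engine:

```
-- Slower: FILTER materializes and iterates the entire fact table
Red Sales Slow :=
CALCULATE (
    SUM ( FactSales[SalesAmount] ),
    FILTER ( FactSales, FactSales[Color] = "Red" )
)

-- Faster: a column predicate is handled by the storage engine
Red Sales Fast :=
CALCULATE (
    SUM ( FactSales[SalesAmount] ),
    FactSales[Color] = "Red"
)
```

The two return the same numbers in most filter contexts (the boolean form replaces any existing filter on Color); the performance difference grows with the size of the table.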

Monitor User Query Patterns and Train Users

Once your model is in production, it is important to keep monitoring user query patterns as well as resource usage to spot potential bottlenecks. This tells you whether performance issues are caused by inefficient DAX, bad design, insufficient resources, or, just as importantly, by a user simply using the model inefficiently. In one case, we found that slow performance for all users was caused by a single user dumping the entire 100 GB model into spreadsheets so he could perform custom calculations on top of it. This blocked queries for all the other users and made things really slow for them. With appropriate requirements gathering, we made sure all the calculations that user needed were available in the model, and then trained the user to do his analytics in the model.
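As one starting point for that monitoring, SSAS exposes dynamic management views you can query from SSMS against the Tabular instance (this is a sketch, and admin rights on the instance are assumed). It lists current sessions along with what each has consumed, which is often enough to spot the one user dragging everyone else down:

```
-- Current sessions: who is connected, what they last ran, and what it cost
SELECT SESSION_SPID, SESSION_USER_NAME, SESSION_LAST_COMMAND,
       SESSION_CPU_TIME_MS, SESSION_READS
FROM $SYSTEM.DISCOVER_SESSIONS
```

For longer-term patterns, capture query events with Extended Events or SQL Server Profiler and aggregate them per user and per report over time.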

The success of any Tabular project depends on adoption by the end users, and needless to say, adoption is much better when the system is fast. These five tips will give you a head start on that journey.

Looking to Use SQL Server Analysis Services Hosting?

As this is an intermediate service, you can get SQL Server Analysis Services on our Windows Dedicated Server plans. You can start with our Bundled Cloud Platinum Class plan to get this service running on your server. This plan includes:

- Windows Server 2008/2012 license
- 2 x 2.0 GHz cores
- 4 GB RAM --> FREE upgrade to 8 GB RAM (use promo code 'DOUBLERAM')
- 2 x 500 GB storage
- 20,000 GB bandwidth
- 1,000 Mbps connection speed
- 4 static IPs
- Full 24 x 7 RDP access
- Full 24/7 firewall protection
- SQL Server 2008/2012 Standard edition
- MySQL database support
- PLESK control panel
- Antivirus
- Unlimited MSSQL databases
- Unlimited MySQL databases
- FREE SmarterMail service if you register this December

Get a HUGE discount this Christmas!! For more information, please visit our official site at http://www.asphostportal.com.



SQL Server 2008 R2 Hosting - ASPHostPortal :: Microsoft SQL Server 2008 R2 PowerPivot Planning and Deployment

July 17, 2010 08:40 by author Jervis

Microsoft® SQL Server® PowerPivot is an innovative data analysis technology that redefines how organizations of all kinds deliver and succeed with business intelligence (BI). The focus shifts from IT delivering corporate BI solutions to a managed BI collaboration environment that gives users the power to get timely and reliable information to make more relevant decisions. PowerPivot does not replace corporate BI, but complements it with managed, self-service solutions.

Providing business insights to all employees means giving producers of intelligence access to the best data analysis tools and reliable access to trustworthy data, as well as facilitating knowledge-sharing and collaboration within teams of producers and consumers of intelligence and across departmental boundaries. It means leveraging the network of power users in each department, the ones who create departmental and team solutions, assist colleagues in ad-hoc analysis, and, when a centralized solution is best, communicate BI requirements back to the IT department. It also means implementing reliable BI monitoring and management processes to ensure availability and performance for mission-critical, self-service BI applications. PowerPivot enables organizations to extend the reach of BI in the enterprise from corporate to team and individual spaces, while at the same time increasing IT management and operations efficiency.


PowerPivot integrates with Microsoft Office Excel® 2010 to give users unmatched computational power for advanced data analysis with a familiar user interface. PowerPivot also integrates with Microsoft SharePoint® Server 2010 to establish a managed, self-service BI environment that takes advantage of all the standard SharePoint features, such as role-based security, compliance policies, workflows, and versioning, and introduces new features and capabilities, such as PowerPivot Gallery and automatic data refreshing for shared workbook applications. And PowerPivot features a management dashboard that lets users monitor shared applications, track usage patterns over time, drill down to reveal hidden details, discover mission-critical solutions, and make sure appropriate server resources are provisioned.


Establishing a managed, self-service BI environment entails deploying the PowerPivot for Excel add-in on workstations running Microsoft Office 2010 and PowerPivot for SharePoint on SharePoint application servers. These deployments can be performed independently. The PowerPivot for Excel add-in does not require a SharePoint environment, and the SharePoint environment does not require PowerPivot for Excel on all workstations. The Excel add-in is a requirement only for those creating and publishing workbook applications. Other users can access published workbook applications in SharePoint via a Web browser, with the same performance as the Excel client and most of its features. However, planning and coordination are required to determine the best deployment sequence and configuration options, provision adequate storage capacities and system resources, and optimize the managed BI collaboration environment for high availability and performance.

This white paper contains information for technical decision makers, IT administrators, and system architects who are planning to deploy PowerPivot technology in an enterprise environment. This paper assumes the audience is already familiar with the Windows Server® operating system, Microsoft Office, SharePoint, SQL Server, online analytical processing (OLAP), and self-service BI. A high-level understanding of the new features and technologies in Microsoft SQL Server 2008 R2 is also helpful.

If you are looking for more information about SQL Server 2008 R2 hosting, visit ASPHostPortal.

Reasons why you must trust ASPHostPortal.com

Every provider will tell you how great their support, uptime, expertise, and guarantees are. Take a close look: what they are really offering is nothing close to what ASPHostPortal does. You will be treated with respect and provided the courtesy and service you would expect from a world-class web hosting business.

You’ll have highly trained, skilled, professional technical support people ready, willing, and wanting to help you 24 hours a day. Your web hosting account servers are monitored from three monitoring points, with two alert points, every minute, 24 hours a day, 7 days a week, 365 days a year. The following is a list of the other added benefits you can find when hosting with us:

- DELL Hardware
Dell hardware is engineered to keep critical enterprise applications running around the clock with clustered solutions fully tested and certified by Dell and other leading operating system and application providers.
- Recovery Systems
Recovery becomes easy and seamless with our fully managed backup services. We monitor your server to ensure your data is properly backed up and recoverable so when the time comes, you can easily repair or recover your data.
- Control Panel
We provide one of the most comprehensive customer control panels available. Providing maximum control and ease of use, our Control Panel serves as the central management point for your ASPHostPortal account. You’ll use a flexible, powerful hosting control panel that gives you direct control over your web hosting account. Our control panel and systems configuration are fully automated, which means your settings are configured automatically and instantly.
- Excellent Expertise in Technology
The reason we can provide you with a great amount of power, flexibility, and simplicity at such a discounted price is due to incredible efficiencies within our business. We have not just been providing hosting for many clients for years, we have also been researching, developing, and innovating every aspect of our operations, systems, procedures, strategy, management, and teams. Our operations are based on a continual improvement program where we review thousands of systems, operational and management metrics in real-time, to fine-tune every aspect of our operation and activities. We continually train and retrain all people in our teams. We provide all people in our teams with the time, space, and inspiration to research, understand, and explore the Internet in search of greater knowledge. We do this while providing you with the best hosting services for the lowest possible price.
- Data Center
ASPHostPortal's modular Tier-3 data center was specifically designed to be a world-class web hosting facility totally dedicated to uncompromised performance and security.
- Monitoring Services
From the moment your server is connected to our network it is monitored for connectivity, disk, memory and CPU utilization - as well as hardware failures. Our engineers are alerted to potential issues before they become critical.
- Network
ASPHostPortal has architected its network like no other hosting company. Every facet of our network infrastructure scales to gigabit speeds with no single point of failure.
- Security
Network security and the security of your server are ASPHostPortal's top priorities. Our security team is constantly monitoring the entire network for unusual or suspicious behavior so that when it is detected we can address the issue before our network or your server is affected.
- Support Services
Engineers staff our data center 24 hours a day, 7 days a week, 365 days a year to manage the network infrastructure and oversee top-of-the-line servers that host our clients' critical sites and services.

About ASPHostPortal.com

We’re a company that works differently from most. Value is what we deliver and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic; it’s based on an even simpler principle: "Do good things, and good things will come to you".

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience, more than the journey. Life is a continual experience. We see the Internet as an incredible amplifier of the experience of life for all of us. It can help humanity come together to explode in knowledge exploration and discussion. It is a continual enlightenment of new ideas, experiences, and passions.




Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
