Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

ASP.NET MVC 6 Hosting - ASPHostPortal :: Creating Custom Controller Factory ASP.NET MVC

February 20, 2016 00:01 by author Jervis

I was reading about “Control application behavior by using MVC extensibility points”, one of the objectives for the 70-486 Microsoft certification, and the explanation provided was not clear to me. So I decided to write about it to make it clear for myself, and I hope it helps you as well.

An ASP.NET MVC application contains the following class:

public class HomeController : Controller
{
   public HomeController(ILogger logger) // notice the parameter in the constructor
   {

   }
}

This throws an error with the DefaultControllerFactory.

The application won’t be able to load the Home controller because its constructor has a parameter. You need to ensure that HomeController can be instantiated by the MVC framework. To accomplish this, we are going to use dependency injection.

The solution is to create a custom controller factory.

By default, the DefaultControllerFactory calls the parameterless constructor of the controller class. To have the MVC framework create controller instances for constructors that have parameters, you must use a custom controller factory. To accomplish this, you create a class that implements IControllerFactory and implement its methods. You then call the SetControllerFactory method of the ControllerBuilder class, passing your custom controller factory class.
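The post assumes an ILogger abstraction and a DefaultLogger implementation that are not shown; a minimal hypothetical sketch of those two types could look like this:

public interface ILogger
{
    void Log(string message);
}

public class DefaultLogger : ILogger
{
    public void Log(string message)
    {
        // Write to the trace output; any logging sink would do here.
        System.Diagnostics.Trace.WriteLine(message);
    }
}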

Create a CustomControllerFactory class that implements IControllerFactory:

public class CustomControllerFactory : IControllerFactory
{
    public CustomControllerFactory()
    {
    }

    public IController CreateController(System.Web.Routing.RequestContext requestContext, string controllerName)
    {
        ILogger logger = new DefaultLogger();
        var controller = new HomeController(logger);
        return controller;
    }

    public System.Web.SessionState.SessionStateBehavior GetControllerSessionBehavior(System.Web.Routing.RequestContext requestContext, string controllerName)
    {
        return System.Web.SessionState.SessionStateBehavior.Default;
    }

    public void ReleaseController(IController controller)
    {
        var disposable = controller as IDisposable;
        if (disposable != null)
        {
            disposable.Dispose();
        }
    }
}

You can implement the CreateController() method in a more generic way, using reflection.

public class CustomControllerFactory : IControllerFactory
{
    private readonly string _controllerNamespace;

    public CustomControllerFactory(string controllerNamespace)
    {
        _controllerNamespace = controllerNamespace;
    }

    public IController CreateController(System.Web.Routing.RequestContext requestContext, string controllerName)
    {
        ILogger logger = new DefaultLogger();
        Type controllerType = Type.GetType(string.Concat(_controllerNamespace, ".", controllerName, "Controller"));
        IController controller = Activator.CreateInstance(controllerType, new object[] { logger }) as IController;
        return controller;
    }

    // GetControllerSessionBehavior and ReleaseController are the same as in the previous version and are omitted here.
}

Set your controller factory in Application_Start by using the SetControllerFactory method:

protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    GlobalConfiguration.Configure(WebApiConfig.Register);
    FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
    RouteConfig.RegisterRoutes(RouteTable.Routes);
    BundleConfig.RegisterBundles(BundleTable.Bundles);

    ControllerBuilder.Current.SetControllerFactory(typeof(CustomControllerFactory));
}
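Note that SetControllerFactory also has an overload that accepts an IControllerFactory instance. For the reflection-based factory above, which needs the controller namespace in its constructor, you would register an instance instead; the namespace below is a placeholder for your own project's controllers namespace:

protected void Application_Start()
{
    // ... the registration calls shown above ...

    // "MyApp.Controllers" is a placeholder; use your project's controllers namespace.
    ControllerBuilder.Current.SetControllerFactory(
        new CustomControllerFactory("MyApp.Controllers"));
}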

This could be one of the objectives of the Microsoft certification exam 70-486, more specifically under “Develop the user experience”, sub-objective “Control application behavior by using MVC extensibility points”.

Hope this helped you to understand how to do dependency injection in controllers with MVC.

 

 



SQL 2014 Hosting - ASPHostPortal :: Make Your SSAS Work Like a Private Jet!

December 10, 2015 19:54 by author Jervis

SQL Server Analysis Services (SSAS) Tabular is a popular choice as an analytical engine for many customers. With its state-of-the-art compression algorithms, multi-threaded query processor and in-memory capabilities, SSAS Tabular can provide super quick access to data by reporting client applications. However, as a consultant, I have been called by many clients to resolve slow query performance when accessing data from SSAS Tabular models. My experiences have taught me most, if not all, of the performance issues can be resolved by taking care of the following five subject areas. 

Estimate Current Size and Growth Carefully

Tabular models compress data really well and, on average, you can expect compression rates of around 10x (though it can be much more or less depending on the cardinality of your data). However, when you are estimating the size of your model as well as its future growth, a rough figure like this is not going to be optimal. If you already have data sitting in a data warehouse, import a subset of that data — say a month — to find the model size for it, and then extrapolate that value based on the required number of rows/years as well as the number of columns. If not, try to at least get a subset of the data from the source to find the model size. Tools like the BISM Memory Report and VertiPaq Analyzer can further help in this process.
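To illustrate with hypothetical numbers: if one month of data compresses to a 2 GB model, three years of data at similar cardinality would land somewhere around 72 GB (36 x 2 GB), before accounting for additional columns.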

It is also important to record the number of users who will be accessing the system currently as well as the estimated growth for the number of users.

Select or Upgrade Hardware Appropriately

Everyone knows SSAS Tabular is a memory-intensive application, and one major issue I have seen is that only the RAM is considered when hardware selections are made. However, just focusing on the RAM is not enough; there are a lot of other variables. Assuming all the other variables are constant and there is an unlimited budget, these are the recommendations:

CPU Speed – The faster, the better; it helps compute results faster, especially when the single-threaded formula engine is the bottleneck.

CPU Cores – In theory, the more the better, as more cores help in managing concurrent user loads. In reality, however, a rise in the number of cores usually corresponds to a decrease in CPU speed due to management overhead, so a balanced approach has to be taken when determining CPU cores and speed. Also, licensing costs increase with the number of cores for many products.

CPU Sockets – The fewer, the better, as SSAS Tabular is not NUMA-aware up to SQL 2014. This is expected to change in SQL 2016, where some NUMA optimization has been made. For large tabular models, it might be a challenge to go single-socket, as the amount of RAM a system can support depends on the number of CPU sockets.

CPU Cache – The more, the better. Retrieving data from CPU caches is 10-100x faster than retrieving data from RAM.

CPU Architecture – The newer, the better, due to hardware performance optimizations. For example, an Intel Xeon processor on the Haswell architecture is always going to be faster than one on Sandy Bridge, keeping all other variables constant.

Amount of RAM – Should be at least 2.5x the model size if the model is going to be processed on the same server. The amount of RAM can be less in certain scale-out architectures where the model is processed on a separate server.

RAM Speed – The faster, the better (yes, RAM has speed too!). This is very important for a memory-bound application like Tabular; always go for faster speeds if the budget allows.

Storage – Not important for query performance, as it has no effect on it. However, if the budget allows, it might not be a bad idea to get faster storage like SSDs, as that will help with maintenance-related activities like backups, processing, or getting the tabular model back online faster when the service is restarted.

Apart from this, there are other factors like network latency and server architecture (scale-out) that have to be considered, but depending on the budget and specific customer requirements, a balanced approach will have to be taken.

Design the Data Model Properly

Tabular is really good at performance and, in the case of small models, is extremely forgiving of bad design. However, when the amount of data grows, performance problems begin to show up. In theory, you get the best performance in SSAS Tabular if the entire data set is flattened into a single table. In reality, though, this would translate to an extremely bad user experience as well as a lengthy and expensive ETL process, so the best practice is generally to have a star schema. It is also recommended to include only the relevant columns from the source tables, as increasing the number of columns increases the model size, which in turn results in slower query performance. An increase in the number of rows might still be OK as long as the cardinality of the columns doesn’t change much.

Depending on the specific customer requirements, there could be deviations from the best practices. For example, we built custom aggregate tables along with the detailed fact table for a very large production model for a client. The resulting measure had a conditional statement to retrieve data from the aggregate table if the detailed-level dimension data was not used in the report. Since the aggregate table was only 1/10 the size of the detailed fact table, the query came out 10x faster whenever the details were not used, which was almost 90% of the time.

Optimize the DAX Calculations

In the case of small models, Tabular is extremely forgiving of bad DAX code too. However, just as with bad design, performance degrades as you increase the data, add more users, or run complex queries. DAX is the most difficult item on this list to tune, and it is important to have a strategy for maintaining and tuning performance. A good place to start would be the Performance Tuning of Tabular Models in SSAS 2012 whitepaper.

Monitor User Query Patterns and Train Users

Once your model is in production, it is important to keep monitoring the user query patterns as well as the resources to spot potential bottlenecks. Through this, you can find whether the performance issues are caused by inefficient DAX, bad design, insufficient resources or, most importantly, whether it is just because a user is using the model inefficiently. For example, in one case we found that slow performance for all users was due to a single user dumping the entire 100 GB model into spreadsheets so he could perform custom calculations on top of it. This blocked the queries for all the other users and made things really slow for them. With appropriate requirements gathering, we ensured all the calculations that user needed were available in the model and then trained the user to use the model for his analytics.

The success of any tabular project depends on adoption by the end users, and needless to say, adoption will be much better if the system is fast. These five tips will give you a jumpstart on that journey.

Looking to Use SQL Server Analysis Services Hosting?

As this is an intermediate service, you can get SQL Server Analysis Services on our Windows Dedicated Server plan. You can start from our Bundled Cloud Platinum Class to get this service running on your server. This plan includes:

- Windows Server 2008/2012 license
- 2 x 2.0 GHz Cores
- 4 GB RAM --> FREE upgrade to 8 GB RAM (use promo code ‘DOUBLERAM’)
- 2 x 500 GB Storage
- 20,000 GB Bandwidth
- 1000 Mbps Connection speed
- 4 Static IPs
- Full 24 x 7 RDP Access
- Full 24/7 Firewall Protection
- SQL Server 2008/2012 Standard edition
- MySQL database support
- PLESK Control Panel
- Antivirus
- Unlimited MSSQL databases
- Unlimited MySQL databases
- FREE SmarterMail service if you register this December

Get a HUGE discount for this Christmas!! For more information, please visit our official site at http://www.asphostportal.com.



SQL 2014 Hosting - ASPHostPortal :: How to Optimize Your SQL Query

October 6, 2015 08:59 by author Jervis

Modern software applications have millions of concurrent users. Developing an application that can serve them efficiently requires a huge amount of effort and many tools and techniques. Software developers always try to improve application performance by improving design, coding, and database development. For database development, query optimization and evaluation techniques play a vital part.

Select only the required fields

It is very important to avoid selecting unnecessary data in a query. Select only the fields you need, not all fields of the table.

SELECT login_id, password FROM tbluser

Index

Properly created indexes help optimize search results. You need to understand the database well before selecting a well-performing index. Choosing a heavily used field as an index is very important.

CREATE CLUSTERED INDEX ind_login_id ON tbluser(login_id)

Primary Key

The Primary Key is the most important index of the table. The most important thing about a Primary Key is the selection of a short and unique field. This will lead to easy access to the data records.

CREATE TABLE tbluser(
  id INT NOT NULL,
  name VARCHAR(150),
  email VARCHAR(100),
  login_id VARCHAR(100),
  password VARCHAR(10),
  PRIMARY KEY (id)
)

Index unique column

Indexing a unique column will improve searching and increase the efficiency of the database. You should have a good understanding of the data fields and their utilization before indexing a unique column. Indexing a rarely used column does not help improve the efficiency of the database.

CREATE INDEX ind_email ON tbluser(email)  

Select limited records

No user interface can display thousands of records at once, so there is no point in selecting all the records at once; always limit the selection when you have a large number of records, and select only the data you need.

SELECT TOP 10 id, name, email, login_id, password FROM tbluser

Selection of correct data type and length

Use the most appropriate data type and correct length for the data. A bad choice of data type will produce a bulky database and poor performance, while choosing correctly improves resource utilization of the database server.

CREATE TABLE tbluser(id INT,  
   name VARCHAR(150),  
   email VARCHAR(100),  
   login_id VARCHAR(100),  
   password VARCHAR(10)  
)  

Avoid IN sub-queries

Avoid the use of IN sub-queries in your applications where possible. An IN sub-query will evaluate all the records of table A against table B (a product of records) before selecting the required data.

SELECT login_id,name, email FROM tbluser WHERE login_id IN ( SELECT login_id FROM tbllogin_details)

A better approach is to use an INNER JOIN, as in the following:

SELECT tbluser.login_id, tbluser.name, tbluser.email FROM tbluser INNER JOIN tbllogin_details ON tbluser.login_id = tbllogin_details.login_id

Avoid NOT operator

Avoid using the NOT operator when the number of qualifying records is lower than the number of unqualified records. Prefer positive operators such as LIKE and EXISTS over NOT LIKE and NOT EXISTS.

SELECT * FROM tbluser WHERE email NOT LIKE '%gmail%'  

The preferred way is:

SELECT * FROM tbluser WHERE email LIKE '%yahoo%'  



SQL Hosting with ASPHostPortal :: Using SQLBulkCopy and C# to Upload File

August 12, 2015 08:17 by author Jervis

In this article I am going to write about SqlBulkCopy and its major properties and methods. This article will give you the code for high-performance transfer of rows from an XML file to SQL Server with SqlBulkCopy and C#.

SqlBulkCopy was introduced as part of .NET Framework 2.0. It is a simple and easy tool for transferring complicated or simple data from one data source to another. You can read data from any data source, as long as that data can be loaded into a DataTable or read by an IDataReader, and transfer it with high performance to SQL Server using SqlBulkCopy.

In real-world applications, millions of records get transferred from one data store to another every day. There are multiple ways to transfer the data, like the command-line bcp utility of SQL Server, INSERT statements, SSIS packages, and SqlBulkCopy. SqlBulkCopy gives you a significant performance gain over the other tools.

SqlBulkCopy Constructors

SqlBulkCopy instances can be initialized in four different ways:

1. Accepts an already open SqlConnection for the destination.
2. Accepts a connection string for a SqlConnection. This constructor actually opens and initializes a new instance of SqlConnection for the destination.
3. Accepts a connection string for a SqlConnection and an enum value of SqlBulkCopyOptions. This constructor actually opens and initializes a new instance of SqlConnection for the destination.
4. Accepts an already opened SqlConnection and an enum value of SqlBulkCopyOptions.

SqlBulkCopy bulkCopy =
    new SqlBulkCopy(destinationConnection.ConnectionString,
        SqlBulkCopyOptions.TableLock);

BatchSize

SqlBulkCopy BatchSize is an integer property with a default value of 0. It decides how many rows are sent to the server in one batch. If you do not set any value for this property, or set it to 0, all the records will be sent in a single batch.

Following example sets BatchSize property as 50.

bulkCopy.BatchSize = 50;

ColumnMappings

SqlBulkCopy ColumnMappings is a collection of the column mappings from the source table to the destination table's columns. You do not need to map the columns if the column names are the same; however, it is very important to map the columns if the column names are different. If SqlBulkCopy cannot find the matching column, it throws a System.InvalidOperationException.

You can map the columns in different ways; giving both column names is an easy and readable method.

The code below maps the OrderID column of the source table to the NewOrderID column of the destination table.

bulkCopy.ColumnMappings.Add("OrderID", "NewOrderID");  

Data Type issue while mapping the column

SqlBulkCopy is particular about matching column data types. Both columns have to be of the same data type. If you have nullable columns, you have to explicitly convert such columns into the desired data type.

The code below converts NULL to an empty string cast as varchar(2), so it can be mapped to any varchar(2) column of the destination table.

SELECT  CAST(ISNULL(ShipRegion,'') as varchar(2))
            as ShipRegion FROM Orders

Quick note: if you have computed columns like SUM, AVG, etc., make sure they return the expected data type. If your destination table expects columns with decimal(15,7), you will have to explicitly convert the source column to decimal(15,7), because SUM will by default return decimal(38,7).

DestinationTableName

This sets the name of the destination table. The WriteToServer method will copy the source rows to this particular table.

The code below sets the destination table to "TopOrders".

bulkCopy.DestinationTableName = "TopOrders";   

NotifyAfter and SqlRowsCopied

NotifyAfter is an integer property with a default value of 0, and SqlRowsCopied is an event. The value of NotifyAfter indicates after how many rows the SqlRowsCopied event is raised.

The code below shows that after every 100 rows are processed, the SqlRowsCopied event handler will be executed.

bulkCopy.SqlRowsCopied +=
    new SqlRowsCopiedEventHandler(OnSqlRowsTransfer);
bulkCopy.NotifyAfter = 100;

private static void
    OnSqlRowsTransfer(object sender, SqlRowsCopiedEventArgs e)
{
        Console.WriteLine("Copied {0} so far...", e.RowsCopied);
}

WriteToServer

WriteToServer is the method that actually copies your source data to the destination table. It accepts an array of DataRows, a DataTable, or an IDataReader. With a DataTable, you can also specify the state of the rows that need to be processed.

The following code will copy only those rows from the sourceData DataTable whose RowState is Added to the destination table.

bulkCopy.WriteToServer(sourceData, DataRowState.Added);
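Putting the pieces above together, here is a minimal end-to-end sketch. The connection string, XML file path, and table/column names (TopOrders, OrderID, NewOrderID) are assumptions for illustration only:

using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    static void Main()
    {
        // Hypothetical connection string and file path - adjust for your environment.
        string connectionString = "Data Source=.;Initial Catalog=Northwind;Integrated Security=True";
        string xmlFile = @"C:\Data\Orders.xml";

        // Load the XML file into a DataTable via a DataSet.
        var dataSet = new DataSet();
        dataSet.ReadXml(xmlFile);
        DataTable sourceData = dataSet.Tables[0];

        using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
        {
            bulkCopy.DestinationTableName = "TopOrders";
            bulkCopy.BatchSize = 50;

            // Map source columns to destination columns where the names differ (as discussed above).
            bulkCopy.ColumnMappings.Add("OrderID", "NewOrderID");

            // Report progress every 100 rows.
            bulkCopy.NotifyAfter = 100;
            bulkCopy.SqlRowsCopied += (sender, e) =>
                Console.WriteLine("Copied {0} so far...", e.RowsCopied);

            // Push the rows of the DataTable to the destination table.
            bulkCopy.WriteToServer(sourceData);
        }
    }
}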



ASP.NET MVC 4 Hosting - ASPHostPortal :: How to Fix Could not load file or assembly DotNetOpenAuth.Core, Version=4.0.0.0

September 8, 2014 09:33 by author Jervis

We have seen many of our clients experience this problem, and we have also found this issue on many forums, so we decided to write up the solution in this tutorial and hope it fixes your problem.

Error Message

Could not load file or assembly 'DotNetOpenAuth.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)

The Problem

The application was actually upgraded earlier from ASP.NET MVC 3, i.e. it was not autogenerated by Visual Studio 2012. The pre-binding info in the exception was:

=== Pre-bind state information ===
LOG: User = NT AUTHORITY\NETWORK SERVICE
LOG: DisplayName = DotNetOpenAuth.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246
 (Fully-specified)
LOG: Appbase = file:///
LOG: Initial PrivatePath = \bin
Calling assembly : Microsoft.Web.WebPages.OAuth, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35.
===
LOG: This bind starts in default load context.
LOG: Using application configuration file: \web.config
LOG: Using host configuration file: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet.config
LOG: Using machine configuration file from C:\Windows\Microsoft.NET\Framework64\v4.0.30319\config\machine.config.
LOG: Post-policy reference: DotNetOpenAuth.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=2780ccd10d57b246
LOG: Attempting download of new URL file:////DotNetOpenAuth.Core.DLL.
LOG: Attempting download of new URL file:////DotNetOpenAuth.Core/DotNetOpenAuth.Core.DLL.
LOG: Attempting download of new URL file:////DotNetOpenAuth.Core.DLL.
WRN: Comparing the assembly name resulted in the mismatch: Minor Version
ERR: Failed to complete setup of assembly (hr = 0x80131040). Probing terminated.

The key pieces of information here are at lines 3 and 7 of the log (the DisplayName and the Calling assembly). Basically, Microsoft.Web.WebPages.OAuth needs DotNetOpenAuth.Core 4.0.0.0, but a different version of DotNetOpenAuth.Core is installed. Check your DotNetOpenAuth.Core version; the problem comes from the version mismatch.

Solution

1. Add these lines under the <runtime>/<assemblyBinding> section of the root Web.config:

<dependentAssembly>
                <assemblyIdentity name="DotNetOpenAuth.AspNet" publicKeyToken="2780ccd10d57b246" culture="neutral" />
                <bindingRedirect oldVersion="0.0.0.0-4.3.0.0" newVersion="4.3.0.0" />
</dependentAssembly>
<dependentAssembly>
                <assemblyIdentity name="DotNetOpenAuth.Core" publicKeyToken="2780ccd10d57b246" culture="neutral" />
                <bindingRedirect oldVersion="0.0.0.0-4.3.0.0" newVersion="4.3.0.0" />
</dependentAssembly>

2. The above solution works for other packages that ASP.NET MVC 4 depends on. For example, if you upgrade WebGrease from 1.0.0.0 to 1.3.0.0, you have to add this to the <runtime>/<assemblyBinding> section:

<dependentAssembly>
                <assemblyIdentity name="WebGrease" publicKeyToken="31bf3856ad364e35" culture="neutral" />
                <bindingRedirect oldVersion="0.0.0.0-1.3.0.0" newVersion="1.3.0.0" />
</dependentAssembly>

 

 



Windows Server 2008/2012 Hosting :: Unmanaged Dedicated Hosting vs Managed Dedicated Hosting

August 6, 2014 08:34 by author Jervis

You are probably familiar with the terms “managed hosting” and “unmanaged hosting”. What are they about? When you purchase a VPS, cloud, or dedicated server, you will see two options: managed or unmanaged hosting. In this post, we will discuss the differences between managed and unmanaged hosting as well as the advantages and disadvantages of both, with more focus on managed hosting.

Imagine you’re buying a house and you have two options. You can buy a home with no strings attached, so that YOU are responsible for cleaning, landscaping, security, and maintenance; OR, you can pay a little extra and have a self-sustaining house that completes all those tasks for you. You can purchase a home that has automatic security and fire protection built in – one that cleans itself, protects itself and magically keeps the lawn and flower beds looking great without any effort on your part. Sounds nice, right?

Well, there’s a chance you might be thinking: Why would I pay extra if I know how to lock the doors, put out a fire, mop the floors, mow the lawn and plant flowers on my own?

Now, imagine you don’t know how to do ANY of those tasks: if you have no knowledge of how to manage a home, your house will be much better off in the long run if you purchase the fully managed, self-sustaining option. This scenario is much like what small business owners face when confronted with the hosting industry. Often they don’t have the tools or know-how to manage a server on their own, and a fully managed hosting plan can help.

It is the same in the hosting business. If you choose unmanaged hosting, you simply rent a server from a hosting provider, and the responsibility of managing that server falls on whoever has the most IT experience at your small business.

Managed hosting basically provides your small business with an exclusive, out-of-the-office IT department. This type of hosting plan means that a perfectly capable team of professionals monitors your dedicated server 24/7/365, keeping your site performing at its best at all times. If you don’t have the experience to manage the server, then you should choose a managed service, just like in the house analogy above.

Technical Knowledge

There is no doubt that technical knowledge is a requirement when it comes to managing dedicated servers and other web hosting services. However, the technical knowledge required is different for managed and unmanaged hosting. The first option does not need you to be an expert in server management, as a team from your web-hosting provider will be there to manage your cloud or dedicated server. Although the full list may be quite long, the team's main responsibilities include server configuration, maintenance, security, necessary upgrades, firewall services, backup services, etc. In other words, once you have purchased a managed web server, you do not have to be concerned about managing the technical aspects of your server. So, when you want to manage your website, you can log into the control panel and do things just as you would on a shared web server.

Unmanaged web hosting is quite the opposite in this respect! The things that come with an unmanaged dedicated server are the physical server itself and the initial installation procedure. This also means that you will have to do everything else yourself. The list of tasks includes software upgrades, security, frequent maintenance, etc. Thus, an unmanaged dedicated server needs you to be an expert in server management, in all its aspects. So, we would say you should go for managed hosting if you want to skip the hassles of server management.

Security System

When it comes to the security of the web server, a managed dedicated server takes first prize, because the team behind the web host will be more than enough to keep your server and websites secure from almost every possible threat. An unmanaged server cannot guarantee that much security, even if you have hired an effective team for maintenance. So, when you want to be less concerned about the security of your web server, you should choose a managed dedicated server without a doubt.

Benefits of Managed Hosting:

Security:

Hosting providers have the tools and experience to monitor and handle your security issues properly: they filter spam, scan for viruses, run security audits, configure software firewalls, and provide OS updates.

Backups:

Lost data can be detrimental and expensive for a company. In many cases a natural disaster is to blame for a significant loss of data, but a hosting provider has procedures to deal with data loss and provide secure, backed up data after such a situation occurs.

Monitoring Server:

This is the process of constantly scanning the server for any problems or irregularities, and it’s crucial for your small business. System admins will ensure network availability for your clients by constant monitoring.

Managed Storage & Database:

System admins will save you money by managing your storage needs with the help of a managed services provider, so that the amount of space needed is managed and scaled properly. Managing the database is more complicated; it is handled by a database admin who can effectively design the best database to meet your company’s specific needs and requirements.

Pricing

Indeed, pricing is a significant factor when choosing between managed hosting and unmanaged hosting! As you may have guessed, managed web servers cost more than unmanaged ones, since they include assistance as well as support. However, if you do not have enough technical knowledge about web servers, purchasing an unmanaged web server also means paying for a team capable of maintaining the server in its optimum functional condition.

Conclusion

Whether you choose unmanaged or managed hosting, you don’t need to worry about spending a lot of money, as we have affordable Windows cloud dedicated servers that we believe will suit your requirements. Please see our cloud dedicated server pricing on our site. Our Windows cloud dedicated servers start from only $18.00/month, and you can choose managed or unmanaged service.



SQL 2014 Hosting - ASPHostPortal.com :: Introduction In-Memory OLTP in SQL 2014

July 4, 2014 06:49 by author Jervis

Microsoft's new release of SQL Server 2014 comes pretty close on the heels of the last SQL Server 2012 release. For many organizations, this could be a hurdle to adoption, because upgrading core pieces of an IT infrastructure can be both costly and resource-intensive. However, SQL Server 2014 has several compelling new features that can definitely justify an upgrade. Here is an overview of the new SQL 2014 features:

1. New In-Memory OLTP Engine
2. Enhanced Windows Server 2012 Integration
3. Improvement in Business Intelligence
4. Office 365 Integration
5. Etc

In today's post, I want to examine SQL Server 2014 In-Memory OLTP from different angles: how to start using it, directions for migration planning, a close review of many of its limitations, a discussion of SQL 2014 In-Memory OLTP applicability, and where SQL Server In-Memory OLTP can be an alternative to in-memory dynamic caching and where it is complementary.

The question now is: what is In-Memory Online Transaction Processing?

SQL Server 2014’s biggest feature is definitely its in-memory transaction processing, or In-Memory OLTP, which Microsoft claims makes database operations much faster. In-memory database technology for SQL Server has long been in the works under the code name “Hekaton”. Hekaton is a database engine component optimized for accessing memory-resident tables. This component is great, and it is fully integrated into the SQL 2014 database engine.

The Function of In-Memory Online Transaction Processing

Hekaton facilitates the creation of memory-resident tables (i.e. memory-optimized tables) and indexes. Besides that, it also provides the option to compile Transact-SQL stored procedures that access memory-optimized tables into machine code. Memory-optimized tables offer better performance because the core engine uses a lock-free algorithm that doesn’t require locks or latches when memory-optimized tables are referenced during transaction processing. A small example of creating such a table follows.
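As a rough illustration (the table, columns, and connection string below are hypothetical, and the database must already have a MEMORY_OPTIMIZED_DATA filegroup configured), a memory-optimized table is created with ordinary DDL plus the MEMORY_OPTIMIZED option, which you can also run from C# via ADO.NET:

using System.Data.SqlClient;

class CreateMemoryOptimizedTable
{
    static void Main()
    {
        // Hypothetical connection string; adjust for your environment.
        const string connectionString =
            "Data Source=.;Initial Catalog=SalesDb;Integrated Security=True";

        // Durable memory-optimized table with a hash index on the primary key.
        const string ddl = @"
            CREATE TABLE dbo.ShoppingCart
            (
                CartId INT NOT NULL
                    PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
                UserId INT NOT NULL,
                CreatedDate DATETIME2 NOT NULL
            )
            WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(ddl, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}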

The Low Cost of Using In-Memory OLTP

The SQL Server database engine was designed in the days when main memory was very costly. As per that design, data is stored on disk and loaded into main memory as required for transaction processing, and any changes to the in-memory data are written back to disk. This disk I/O is the main bottleneck for OLTP applications with huge numbers of concurrent users, as it involves waiting for locks to be released, latches to become available, and log writes to complete.

Main memory prices are now much lower, and enterprises can easily afford production database servers with main memory sizes in terabytes. This decline in memory prices led Microsoft to rethink the original database engine design from the days when main memory was costly. The result of this rethink is the In-Memory OLTP (a.k.a. Hekaton) database engine component, which supports memory-resident tables and indexes. The In-Memory OLTP engine uses a lock-free algorithm (multiversion optimistic concurrency control) that doesn’t require locks or latches when memory-optimized tables are referenced during transaction processing. To support data durability it still writes to the transaction log, but the amount of data written to the log is reduced considerably.

Migrating to SQL Server 2014 In-Memory OLTP

Migration to In-Memory OLTP has to be performed in a development environment and carefully tested. Your high-availability design, database design, table schemas and data, stored procedures, business logic in the database, and even application code – all may require many syntax changes to use In-Memory OLTP.

This is not a “click and migrate” process. It requires development cycles, application and database design and code changes.

The right way to use the In-Memory OLTP engine is:

  • Plan your production database architecture. In-Memory OLTP is very different and has many limitations in terms of the available H/A, mirroring, and replication functionality;
  • Plan carefully your new database (and possibly application) design;
  • Migrate several specific tables and procedures that are good benefit candidates;
  • Develop or change your business-logic to fit the new design;
  • Test and evaluate;
  • Deploy

To evaluate whether In-Memory OLTP can improve your database performance, you can use Microsoft's new AMR tool (Analysis, Migrate and Report). To help with the actual migration, you can use the Memory Optimization Advisor for tables and the Native Compilation Advisor to help port a stored procedure to a natively compiled stored procedure.

The AMR tool helps identify the tables and stored procedures that would benefit from being moved into memory and also helps perform the actual migration of those database objects. The AMR tool is installed when you select the “Complete” option of “Management Tools”, and is later accessed through SQL Server Management Studio (SSMS) under Reports -> Management Data Warehouse -> Transaction performance reports.

The AMR tool provides reports on which tables and procedures can benefit the most from In-Memory OLTP and gives a hint of how complex the conversion will be. The reports show recommendations based on usage, contention, and performance.

After you identify a table that you would like to port to use In-Memory OLTP, you can use the Memory-Optimization Advisor to help you migrate the disk-based database table to In-Memory OLTP. In SSMS Object Explorer, right click the table you want to convert, and select Memory-Optimization Advisor.

Conclusion

The above article is only brief information about one of the new features in SQL 2014. Want to try it? We support the latest SQL 2014 hosting in our hosting environment. Just take a look at our site for more information.



Free Trial Web Deploy 3.5 Hosting :: ASPHostPortal.com Proudly Launches Web Deploy 3.5 Hosting

November 1, 2013 10:34 by author Mike

Professional web hosting provider ASPHostPortal.com proudly announced the integration of the latest Web Deploy 3.5 in all web hosting plans. We are among the first few hosting companies to provide an ASP.NET hosting plan that supports the brand new Web Deploy 3.5.


WebDeploy v3.5 is now available and there are a few features to consider in this minor release:

  1. Load Balancer Support with Session Affinity. 
  2. Encrypting web.config Settings Post-Publish.
  3. App Offline Template.
  4. Seamless integration with IIS Manager (IIS7 and above), Visual Studio (2010 and above) for creating packages and  deploying them onto a machine, both locally and remotely.
  5. Integration with WebMatrix for deploying and downloading web applications.
  6. Seamless integration with the Web Platform Installer to install community web applications simply and easily.
  7. Web application packaging and deployment.
  8. Web server migration and synchronization.
  9. Automatic backup of Web Sites before making any changes.
  10. In addition to IIS Manager, Visual Studio 2010 and WebMatrix, tasks can be performed using the command line, PowerShell cmdlets or public APIs.

According to ASPHostPortal.com, its Web Deploy 3.5 offerings are distinguished by their low cost, with many of the hosting services supporting the technology being of the more expensive variety.
For more information about this topic or any enquiries related to Web Deploy 3.5 hosting, please visit http://www.asphostportal.com

About ASPHostPortal.com:
ASPHostPortal.com is a hosting company specializing in Windows and ASP.NET-based hosting. Services include shared hosting, reseller hosting, and SharePoint hosting, with specialties in ASP.NET, SQL Server, and architecting highly scalable solutions. As a leading small to mid-sized business web hosting provider, ASPHostPortal strives to offer the most technologically advanced hosting solutions available to all customers across the world. Security, reliability, and performance are at the core of its hosting operations, ensuring each site and/or application hosted is highly secure and performs at an optimum level.



ASP.NET Hosting - ASPHostPortal :: How to Resolve "~/Telerik.Web.UI.WebResource.axd' is missing in web.config"

September 13, 2013 08:23 by author Mike

Problem 1:
When attempting to run an application that uses certain Telerik controls, you may run into the error below:

System.InvalidOperationException:
‘~/Telerik.Web.UI.WebResource.axd’ is missing in web.config.
RadStyleSheetManager requires a HttpHandler registration in web.config. Please,
use the control Smart Tag to add the handler automatically, or see the help for
more information: Controls > RadStyleSheetManager at
Telerik.Web.UI.RadStyleSheetManager.OnPreRender(EventArgs e) at
System.Web.UI.Control.PreRenderRecursiveInternal() at
System.Web.UI.Control.PreRenderRecursiveInternal() at
System.Web.UI.Control.PreRenderRecursiveInternal() at
System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint,
Boolean includeStagesAfterAsyncPoint)


Solution:
I’ve seen that error on new DotNetNuke installations, but it certainly isn’t specific to DNN, nor is it a problem with the Telerik controls. The issue is that the Integrated Pipeline and Classic pipeline modes handle the handler registration slightly differently within the web.config file.
To resolve this error you can make changes to your web.config file (see the sketch below) or, as a perhaps faster and easier solution, change your site’s pipeline mode to Classic – both seem to work equally well.
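If you go the web.config route, the handler registrations typically look like the following (these are the commonly used entries for the Telerik.Web.UI assembly; verify them against the documentation for your Telerik version):

<system.web>
  <httpHandlers>
    <!-- Classic pipeline mode -->
    <add path="Telerik.Web.UI.WebResource.axd" verb="*"
         type="Telerik.Web.UI.WebResource" validate="false" />
  </httpHandlers>
</system.web>
<system.webServer>
  <handlers>
    <!-- Integrated pipeline mode -->
    <add name="Telerik_Web_UI_WebResource_axd" path="Telerik.Web.UI.WebResource.axd"
         verb="*" type="Telerik.Web.UI.WebResource" preCondition="integratedMode" />
  </handlers>
</system.webServer>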

Problem 2:
When you place a RadScriptManager on an .aspx page and run the page, it gives an error like:

" '~/Telerik.Web.UI.WebResource.axd' is missing in web.config. RadScriptManager requires a HttpHandler registration in web.config. 
Please, use the control Smart Tag to add the handler automatically, or see the help for more information: Controls > RadScriptManager "


Solution:
It's simple and fast: you just have to register the handler from the RadScriptManager's Smart Tag, and it will modify the web.config file automatically.

  • Click the RadScriptManager's Smart Tag. It will display the RadScriptManager Tasks; click "Register Telerik.Web.UI.WebResource.axd".
  • Telerik.Web.UI.WebResource.axd is now successfully registered in web.config.



ASP.NET 4.0 Hosting :: Multiselect Dropdown for Web Applications ASP.NET 4.0

September 4, 2013 11:18 by author Mike

In this article, I will show you how to create a multiselect dropdown in an ASP.NET 4.0 page. I will keep this article short and sweet so you can just use the code in your applications. Let's see how to implement this.

Now, place a TextBox in your design view and assign a PopupControlExtender to it. An important note here: you must place your TextBox control within an UpdatePanel. Afterwards, create a Panel and put a CheckBoxList inside it, then assign this Panel to the PopupControlID property of the PopupControlExtender.


Finally, we need only one modification in the code-behind: each time an item in the CheckBoxList is clicked, the selected values should be added to the TextBox control. You place this logic in the SelectedIndexChanged event handler; a sketch of such a handler follows.
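Here is a minimal sketch of what such a handler might look like, assuming the CheckBoxList is named chkItems and the TextBox is named txtSelected (both names are hypothetical), with AutoPostBack="true" set on the CheckBoxList so the event fires on each click:

// Requires: using System.Collections.Generic; using System.Web.UI.WebControls;
protected void chkItems_SelectedIndexChanged(object sender, EventArgs e)
{
    // Collect the text of every checked item.
    var selected = new List<string>();
    foreach (ListItem item in chkItems.Items)
    {
        if (item.Selected)
        {
            selected.Add(item.Text);
        }
    }

    // Show the combined selection in the textbox that triggers the popup.
    txtSelected.Text = string.Join(", ", selected);
}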


Finish. Hopefully this article can help you.



About ASPHostPortal.com

We’re a company that works differently to most. Value is what we output and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic. It’s based on an even simpler principle. "Do good things, and good things will come to you".

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience – more than the journey. Life is a continual experience. We see the Internet as being an incredible amplifier to the experience of life for all of us. It can help humanity come together to explode in knowledge exploration and discussion. It is continual enlightenment of new ideas, experiences, and passions




Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
