Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

ASP.NET Hosting - ASPHostPortal.com :: How to Prevent ASP.NET Website From Multiple Database Request

clock October 26, 2015 20:44 by author Dan

Executing multiple database requests to load a single page is inefficient. Review your database code to see if you have request paths that go to the database more than once. Each of those round-trips decreases the number of requests per second your application can serve. By returning multiple resultsets in a single database request, you can cut the total time spent communicating with the database.

To improve performance, execute a single stored procedure and bring multiple resultsets back in a single database request. In this article I will explain how to avoid multiple database requests by bringing multiple resultsets into a single request.

Consider a scenario of loading a Product Page, which displays

  • Product Information and
  • Product Review Information

To combine these two database requests into one, declare your SQL Server stored procedure as below.

SQL Server Stored Proc

CREATE PROCEDURE GetProductDetails
 @ProductId bigint
AS
SET NOCOUNT ON

--Product Information
Select ProductId,
 ProductName,
 ProductImage,
 Description,
 Price
From Product
Where ProductId = @ProductId

--Product Review Information
Select  ReviewerName,
 ReviewDesc,
 ReviewDate
From ProductReview
Where ProductId = @ProductId

ASP.NET / C# code to bring multiple database requests into a single request

Code Inside Data Access Class Library (DAL)

public DataSet GetProductDetails()
{
    SqlCommand cmdToExecute = new SqlCommand();
    cmdToExecute.CommandText = "GetProductDetails";
    cmdToExecute.CommandType = CommandType.StoredProcedure;

    DataSet dsResultSet = new DataSet();
    SqlDataAdapter adapter = new SqlDataAdapter(cmdToExecute);
    SqlConnection conn = null;

    try
    {
        string strConnString = System.Configuration.ConfigurationManager
            .ConnectionStrings["ConnStr"].ConnectionString;
        conn = new SqlConnection(strConnString);
        cmdToExecute.Connection = conn;

        // _productId is a field on this DAL class, set before the call
        cmdToExecute.Parameters.Add("@ProductId", SqlDbType.BigInt).Value = _productId;

        // Open connection
        conn.Open();

        // Assign proper names to the multiple tables
        adapter.TableMappings.Add("Table", "ProductInfo");
        adapter.TableMappings.Add("Table1", "ProductReviewInfo");
        adapter.Fill(dsResultSet);

        return dsResultSet;
    }
    catch (Exception ex)
    {
        // some error occurred
        throw new Exception("DB Request error.", ex);
    }
    finally
    {
        if (conn != null) conn.Close();
        cmdToExecute.Dispose();
        adapter.Dispose();
    }
}
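If you do not need a DataSet, the same single round-trip can also be read with a SqlDataReader, advancing between resultsets with NextResult(). The sketch below is an illustrative alternative, not part of the original DAL; it assumes the same stored procedure, the same strConnString connection string, and the same _productId field used above.

```csharp
// Sketch: reading both resultsets from one call with SqlDataReader
using (SqlConnection conn = new SqlConnection(strConnString))
using (SqlCommand cmd = new SqlCommand("GetProductDetails", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@ProductId", SqlDbType.BigInt).Value = _productId;

    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // First resultset: product information
        while (reader.Read())
        {
            string productName = reader.GetString(reader.GetOrdinal("ProductName"));
            // ... bind the product fields to your controls ...
        }

        // Advance to the second resultset: product reviews
        reader.NextResult();
        while (reader.Read())
        {
            string reviewerName = reader.GetString(reader.GetOrdinal("ReviewerName"));
            // ... bind the review fields ...
        }
    }
}
```

The reader approach avoids materializing DataTables, which can matter on hot pages; the DataSet approach above is more convenient when you want to data-bind directly.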

Code inside the ASP.NET .aspx.cs page

protected void Page_Load(object sender, EventArgs e)
{
    if (Request.QueryString["ProductId"] != null)
    {
        long productId = Convert.ToInt64(Request.QueryString["ProductId"]);

        // Assuming you have a Product class in the DAL
        ProductInfo objProduct = new ProductInfo();
        objProduct.ProductId = productId;
        DataSet dsData = objProduct.GetProductDetails();

        DataTable dtProductInfo = dsData.Tables["ProductInfo"];
        DataTable dtProductReviews = dsData.Tables["ProductReviewInfo"];

        // Now you have data tables containing the information;
        // make the necessary assignments to your controls here.
    }
}


Finished. Happy coding!

Best Windows Shared Hosting Recommendation

ASPHostPortal.com provides our customers with Plesk Panel, one of the most popular and stable control panels for Windows hosting, for free. You also get the latest .NET framework, a huge amount of functionality, as well as large disk space, bandwidth, MSSQL databases and more. All of this gives you the convenience to build a powerful site on our shared hosting. We offer Windows hosting starting from $5/month only. We also guarantee 30 days money back and 99.9% uptime. If you need reliable, affordable Windows shared hosting, we should be your best choice.



SQL Server 2014 Hosting - ASPHostPortal :: How to Solve SQL Cannot Generate SSPI Context

clock May 21, 2015 06:24 by author Dan

Everyone knows that it is good practice to use a domain or service account to run the SQL service. I'm sure you do too! However, once you do the right thing and change the SQL service account, you may start getting the following error message when attempting to connect to SQL Server:

“The target principal name is incorrect.  Cannot generate SSPI context.”

The explanation, as given by Microsoft in its KB article, is:

    If you run the SQL Server service under the LocalSystem account, the SPN is automatically registered and Kerberos authentication interacts successfully with the computer that is running SQL Server. However, if you run the SQL Server service under a domain account or under a local account, the attempt to create the SPN will fail in most cases because the domain account and the local account do not have the right to set their own SPNs. When the SPN creation is not successful, this means that no SPN is set up for the computer that is running SQL Server. If you test by using a domain administrator account as the SQL Server service account, the SPN is successfully created because the domain administrator-level credentials that you must have to create an SPN are present.

There are 3 ways to fix the problem:

    - Revert to using the Network Service or Local System account (NOT RECOMMENDED)
    - Assign the domain account to the Domain Admins group (NOT IDEAL – due to the elevated permissions)
    - Fix the problem by giving the domain account just the appropriate permissions in Active Directory. Permissions required are
        >> ServicePrincipalName: Read
        >> ServicePrincipalName: Write

We will use the 3rd option to fix the error. First, it is good practice to verify that the problem is actually due to permission issues. Log in to the server where your SQL instance is running. Go to the error logs and look for the last time the SQL service was restarted. You should find an error message similar to this:

Date                   10/17/2013 9:29:50 AM
Log                    SQL Server (Archive #1 - 10/17/2013 10:53:00 AM)
Source                Server
Message
The SQL Server Network Interface library could not register the Service Principal Name (SPN) [ MSSQLSvc/servername.domainname.net:1433 ] for the SQL Server service. Windows return code: 0x2098, state: 15. Failure to register a SPN might cause integrated authentication to use NTLM instead of Kerberos. This is an informational message. Further action is only required if Kerberos authentication is required by authentication policies and if the SPN has not been manually registered.

This is great. At least now we have verified that the problem is related to the SPN and we are ready to apply the fix.
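You can also check from the command line which SPNs are currently registered for the service account, using the setspn utility that ships with Windows Server. The domain, account, and host names below are placeholders; substitute your own.

```cmd
REM List the SPNs currently registered for the SQL service account
setspn -L YOURDOMAIN\sqlserviceaccount

REM Once the account has Read/Write servicePrincipalName permissions,
REM SQL Server registers the SPN itself at service restart. To register
REM it manually instead (requires sufficient AD rights):
setspn -A MSSQLSvc/servername.domainname.net:1433 YOURDOMAIN\sqlserviceaccount
```

If `setspn -L` already shows the MSSQLSvc entry for your server and port, the SPN problem lies elsewhere (for example a duplicate SPN on another account).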

Log in to the server running your Active Directory service and execute the following steps:

    - Run Adsiedit.msc.
    - In the ADSI Edit snap-in, expand Domain [YourDomainName], expand DC=RootDomainName, expand CN=Users, right-click CN=[YourAccountName], and then click Properties.
    - In the CN=[YourAccountName] Properties dialog box, click the Security tab.
    - On the Security tab, click Advanced.
    - In the Advanced Security Settings dialog box, select any one of the "SELF" rows.
    - Click Edit to open the Permission Entry dialog box.
    - Make sure Principal is "SELF", Type is "Allow", and "Applies to" is "This object only"; then, in the Properties section, select the properties below:
        >> Read servicePrincipalName
        >> Write servicePrincipalName

Click OK to apply all changes and exit the ADSI Edit snap-in

Finally, you need to restart the SQL Service(s) that use the account in question.

You can verify that the SPN has been registered successfully upon the restart by going to the SQL Server logs. You should now see an entry similar to this:

Date                   10/17/2013 10:53:58 AM
Log                    SQL Server (Current - 10/17/2013 10:54:00 AM)
Source                Server
Message
The SQL Server Network Interface library successfully registered the Service Principal Name (SPN) [ MSSQLSvc/servername.domainname.net:1433 ] for the SQL Server service.

Connections to SQL Server should now succeed.




SQL Server 2014 Hosting - ASPHostPortal :: Identify ErrorLog with xp_ReadErrorLog

clock May 7, 2015 07:14 by author Dan

To read error logs in SQL Server using T-SQL, you can use the extended stored procedure xp_ReadErrorLog to read the SQL Server and SQL Server Agent error logs. xp_ReadErrorLog has seven parameters that can be used to filter error logs.

Syntax for xp_ReadErrorLog:

EXEC xp_ReadErrorLog <LogNumber>, <LogType>,
     <SearchTerm1>, <SearchTerm2>,
     <StartDate>, <EndDate>, <SortOrder>

The parameter values can be as follows:

  • LogNumber – 0 for the current log, 1 for ERRORLOG.1, 2 for ERRORLOG.2, and so on
  • LogType – 1 for the SQL Server error log, 2 for the SQL Server Agent error log
  • SearchTerm1, SearchTerm2 – text to filter on; when both are supplied, entries must contain both
  • StartDate, EndDate – limit the output to entries logged within this date range
  • SortOrder – 'ASC' or 'DESC' by log date

You can use the stored procedure as:

EXEC xp_ReadErrorLog

– Reads current SQL Server error log

Below are some more examples of xp_ReadErrorLog:

EXEC xp_ReadErrorLog 1
– Reads SQL Server error log from ERRORLOG.1 file

EXEC xp_ReadErrorLog 0, 1
– Reads current SQL Server error log

EXEC xp_ReadErrorLog 0, 2
– Reads current SQL Server Agent error log

EXEC xp_ReadErrorLog 0, 1, 'Failed'
– Reads current SQL Server error log with text 'Failed'

EXEC xp_ReadErrorLog 0, 1, 'Failed', 'Login'
– Reads current SQL Server error log with text ‘Failed’ AND 'Login'

EXEC xp_ReadErrorLog 0, 1, 'Failed', 'Login', '20121101', NULL
– Reads current SQL Server error log with text ‘Failed’ AND ‘Login’ from 01-Nov-2012

EXEC xp_ReadErrorLog 0, 1, 'Failed', 'Login', '20121101', '20121130'
– Reads current SQL Server error log with text ‘Failed’ AND ‘Login’ between 01-Nov-2012 and 30-Nov-2012

EXEC xp_ReadErrorLog 0, 1, NULL, NULL, '20121101', '20121130'
– Reads current SQL Server error between 01-Nov-2012 and 30-Nov-2012

EXEC xp_ReadErrorLog 0, 1, NULL, NULL, '20121101', '20121130', 'DESC'
– Reads current SQL Server error log between 01-Nov-2012 and 30-Nov-2012 and sorts in descending order
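When you need filtering beyond what the seven parameters allow, a common pattern is to capture the output into a temp table and query it with ordinary T-SQL. The sketch below assumes the standard three columns xp_ReadErrorLog returns for the SQL Server log (log date, process info, and the message text); the temp table and column names are illustrative.

```sql
-- Capture the current SQL Server error log into a temp table
CREATE TABLE #ErrorLog
(
    LogDate     datetime,
    ProcessInfo nvarchar(50),
    LogText     nvarchar(max)
);

INSERT INTO #ErrorLog
EXEC xp_ReadErrorLog 0, 1;

-- Now filter with ordinary T-SQL
SELECT LogDate, LogText
FROM #ErrorLog
WHERE LogText LIKE '%Login failed%'
ORDER BY LogDate DESC;

DROP TABLE #ErrorLog;
```

This is handy for joining the log against other data or for aggregating (for example, counting failed logins per day).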

Hope This Helps!




Windows 2008 Hosting with Free ASP.NET :: 9 Steps to Solve Login Failed Error : 18456 in Microsoft SQL Server 2008

clock April 23, 2015 07:31 by author Dan

In this post, I'll show you how to determine which user has administrative rights on the server, and walk you step by step through configuring SQL Server to allow other users to log in as administrators. The message "Login Failed for User (Microsoft SQL Server, Error: 18456)" indicates that someone entered invalid credentials when logging into SQL Server. In the screenshot below, I am logging into Microsoft SQL Server Management Studio with a user that does not have administrative permissions to connect to the server.

First, you must determine which user does have privileges on the SQL server. Normally this is the user you were logged in with when you installed SQL Server, or the user that is the default administrator on the computer. To get a list of users on the machine (assuming Windows Server 2012 or above), follow the steps below.

Step-By-Step

Launch Server Manager – Start – Click Server Manager



In the upper right corner, click Tools, then Computer Management



Expand Local Users and Groups (Under Computer Management-System Tools) then click Users

Notice under Description there is a "Built-in account for administering..." entry. This is very likely an account that has access to SQL Server. Try logging into Windows with that built-in administrative account; then we can grant rights to the user you want to use to log in to SQL Server.



Now that you are logged into Windows with an account that has access to connect to SQL Server, Let’s go in and grant rights to the user that you want to be able to use to access SQL Server.

Step-By-Step

Launch SQL Server Management Studio again and you should be able to Connect

Expand your ServerName, then Expand Security, then Logins.  Click Logins

Right-Click Logins Select New Login



Click the Search Button

Type in the Windows User Name you would like to add as an SQL Administrator then click Check Names  (or you can click advanced and select from a list)



Click Server Roles in the left pane; then turn on the sysadmin checkbox, then click OK



You can then confirm you have sa rights by double-clicking the user name in the left Object Explorer and clicking on Securables



Now you can log out of Windows, login as the user you just granted rights to and all should work fine.
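If you prefer T-SQL over the Management Studio GUI, the same grant can be scripted. This is a hedged sketch using the SQL Server 2008-era procedure sp_addsrvrolemember; DOMAIN\SomeUser is a placeholder for the Windows account you want to make an administrator.

```sql
-- Create a login for the Windows account (placeholder name)
CREATE LOGIN [DOMAIN\SomeUser] FROM WINDOWS;

-- Add the login to the sysadmin fixed server role
EXEC sp_addsrvrolemember @loginame = 'DOMAIN\SomeUser', @rolename = 'sysadmin';
```

Run this while connected as a login that already has sysadmin rights (for example, the built-in administrative account identified earlier).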




SQL 2014 Hosting - ASPHostPortal.com :: Handling Divide BY Zero Exception

clock March 16, 2015 07:24 by author Ben

Sometimes, while performing a calculation in your query, you get a "Divide by zero" error, and sometimes you end up with an output value of NULL. So how do you handle these problems and exceptions?

Use NULLIF - to handle the divide-by-zero exception
Use ISNULL - to show some value instead of NULL in your output

Below is the full explanation.

select 10/0

If you run the above query, it will throw an error:

Msg 8134, Level 16, State 1, Line 1
Divide by zero error encountered.

To resolve this, we will use the NULLIF function:

select nullif(10 / nullif(0, 0), 0)

Output NULL

Now it will not throw an error, and your output will be NULL. Explanation: the NULLIF function takes two arguments and checks whether they are equal. If they are equal, it returns NULL as output; if the two expressions are not equal, it returns the first expression, with the same data type. Here, in the denominator, it checked whether 0 is equal to 0; the condition was true, so it returned NULL in the denominator. The expression then became:

select Nullif(10/null,0)

Now it checks again whether 10/NULL is equal to zero; the condition fails, and you get:

Output  NULL

Note: any number divided, multiplied, added, or subtracted with NULL results in NULL. Now, how do you use NULLIF and ISNULL in your code or T-SQL to avoid the exception and NULL values? In the code below I have used NULLIF together with the ISNULL function. For this, we will create a table:

CREATE TABLE Test_NULLIF1
(
   col1 int IDENTITY,
   col2 int NULL,
   col3 int NULL
);
INSERT Test_NULLIF1  VALUES(10, 20);
INSERT Test_NULLIF1 VALUES(NULL, 23);
INSERT Test_NULLIF1 VALUES(0, 18);
INSERT Test_NULLIF1 VALUES(NULL,75);
INSERT Test_NULLIF1 VALUES(300000, 21);
Go

If you divide column 3 by column 2 for row 3, it will throw a divide-by-zero exception. Run this query to generate the error:

select col3/col2 from Test_NULLIF1 where col1 = 3

Msg 8134, Level 16, State 1, Line 1
Divide by zero error encountered.

To handle this, use the NULLIF function as explained in the example above:

select nullif(col3/nullif(col2,0),0)  from Test_NULLIF1  where col1 =3

Output NULL

Now you can use the ISNULL function, after handling the divide-by-zero exception, to supply a value when the output would be NULL.
For example, instead of NULL you may want to show a value like 0 or 1.

select ISNULL(col3/nullif(col2,0),1)  from Test_NULLIF1  where col1 =3

Output 1

Explanation: the ISNULL function takes two arguments. It checks whether the first expression is NULL; if it is, the second expression provides a replacement for that NULL, and it can be any value. In this query, ISNULL checked the first expression, found it was NULL, replaced the NULL value with 1, and returned 1 as the output.

If the first expression is not NULL, it simply returns the first expression's value.

Test the ISNULL function with the queries below:

Select ISNULL(null, 2)

output 2

select isnull(3, 1) -- as the first expression is not null, it returns the first value, i.e. 3

output 3
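Putting both functions together over the whole table, here is a sketch using the Test_NULLIF1 table created above. The `* 1.0` is an extra touch not shown earlier: it forces decimal rather than integer division. The SafeRatio alias is illustrative.

```sql
SELECT col1,
       ISNULL(col3 * 1.0 / NULLIF(col2, 0), 0) AS SafeRatio
FROM Test_NULLIF1;
-- Rows where col2 is 0 or NULL yield 0 instead of raising a
-- divide-by-zero error or returning NULL.
```

This one-line pattern (ISNULL around a division whose denominator is wrapped in NULLIF) is the practical takeaway of this article.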




SQL 2014 Hosting - ASPHostPortal.com :: Recovering SQL Server Database from Suspect Mode

clock November 13, 2014 06:49 by author Mark

How do you recover a SQL Server database from Suspect mode?

Sometimes we face a critical situation when a SQL Server database goes into Suspect mode. At that moment no work can be done on the database. A database may go into Suspect mode because the primary filegroup is damaged and the database cannot be recovered during SQL Server startup.

Reason for database to go into suspect mode:

  • Data files or log files are corrupt.
  • Database server was shut down improperly
  • Lack of Disk Space
  • SQL cannot complete a rollback or roll forward operation
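Before starting, you can confirm the database's current state by querying the standard sys.databases catalog view. 'BlueChipDB' below matches the example database name used in the steps that follow.

```sql
SELECT name, state_desc
FROM sys.databases
WHERE name = 'BlueChipDB';
-- state_desc shows SUSPECT, EMERGENCY, RECOVERY_PENDING, ONLINE, etc.
```

Re-running this query after each step below is a quick way to verify that the state change actually took effect.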

How to recover database from suspect mode:

  • Change the status of your database. Suppose the database name is “BlueChipDB”.

EXEC sp_resetstatus '<database_name>';
Example:
EXEC sp_resetstatus 'BlueChipDB'

  • Set the database in “Emergency” mode

ALTER DATABASE <database_name> SET EMERGENCY;
Example:
ALTER DATABASE BlueChipDB SET EMERGENCY

  • Check the database for any inconsistency

DBCC CHECKDB('<database_name>');
Example:
DBCC checkdb('BlueChipDB')

If you get any error after executing DBCC CHECKDB, immediately bring the database into SINGLE USER mode by running the following query. If no error is found, you need not execute it.

ALTER DATABASE <database_name> SET SINGLE_USER WITH ROLLBACK IMMEDIATE
Example:
ALTER DATABASE BlueChipDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE

  • For safety, take a backup of the database.
  • Next, run the following query. Remember that the query below uses REPAIR_ALLOW_DATA_LOSS, which is a one-way operation: once the database is repaired, the actions performed cannot be undone.
  • There is no way to go back to the previous state of the database.
  • So, as a precautionary step, you should take the backup of your database mentioned above.

DBCC CHECKDB ('<database_name>', REPAIR_ALLOW_DATA_LOSS);
Example:
DBCC CHECKDB ('BlueChipDB', REPAIR_ALLOW_DATA_LOSS)

  • Finally, bring the database in MULTI USER mode

ALTER DATABASE <database_name> SET MULTI_USER;
Example:
ALTER DATABASE [BlueChipDB] SET MULTI_USER

  • Refresh your database server and verify the connectivity of your database. Users should now be able to connect properly. If any data was lost, you can restore it from the backup taken earlier.

 



SQL 2014 Hosting Tutorial - ASPHostPortal.com :: SQL Server 2014 Analysis, Migrate, and Report Tool

clock November 10, 2014 11:28 by author Mark

Determine which tables and stored procedures would benefit from In-Memory OLTP

With SQL Server 2014's new In-Memory OLTP engine, you can load tables and stored procedures in memory, which provides very fast response times. The goal isn't to load all the database tables and stored procedures in memory, but rather just those tables that are crucial to performance and those stored procedures that have complex logical calculations.

To help you identify which tables and stored procedures will give you the best performance gain after being migrated to In-Memory OLTP, SQL Server 2014 provides the new Analysis, Migrate, and Report (AMR) tool. Built into SQL Server Management Studio (SSMS), the AMR tool consists of the:

  • Transaction performance collector (which collects data about existing tables and stored procedures in order to analyze workloads) and transaction performance analysis reports (which gives recommendations about the tables and stored procedures to migrate to In-Memory OLTP based on the collected data)
  • Memory Optimization Advisor (which guides you through the process of migrating a table to a memory-optimized table)
  • Native Compilation Advisor (which helps you identify T-SQL elements that need to be changed before migrating a stored procedure to a natively compiled stored procedure)

The AMR tool leverages the new Transaction Performance Collection Sets for gathering information about workloads and the Management Data Warehouse (a relational database) to store the collected data. The Transaction Performance Collection Sets includes the:

  • Stored Procedure Usage Analysis collection set (which captures information about stored procedures for a future migration to natively compiled stored procedures)
  • Table Usage Analysis collection set (which captures information about disk-based tables for a future migration to memory optimized tables)

Before you can use the AMR tool, you need to configure the Management Data Warehouse and the data collection process. After showing you how to do so, I'll demonstrate how to run the transaction performance analysis reports and how to use the two advisors.

Configuring the Management Data Warehouse

To configure the Management Data Warehouse, go to Object Explorer in SSMS. Expand the Management folder, right-click Data Collection, select Tasks, and click Configure Management Data Warehouse. This will launch the Configure Management Data Warehouse Wizard.

After the Welcome page, you'll find yourself on the Select Configuration Task page. On this page, select the option to configure a Management Data Warehouse.

  • On the Configure Management Data Warehouse Storage page, you need to specify the name of the database that will host the Management Data Warehouse and the name of the server on which that database resides. If you need to create the database, click the New button to create one.
  • On the Map Logins and Users page, you'll find the existing logins allowed for the server that will host the Management Data Warehouse. If needed, you can edit the logins or map users to the administrator, reader, and writer roles for the Management Data Warehouse.
  • On the Complete the Wizard page, you need to verify the Management Data Warehouse configuration. If it's OK, click Finish. When the configuration of the Management Data Warehouse has successfully completed, you should see a page like that in Figure 1.

Figure 1: Verifying Configuration of the Management Data Warehouse

The Management Data Warehouse setup is now finished.

Configuring the Data Collection Process

To configure the data collection process, go to Object Explorer in SSMS. Expand the Management folder, right-click Data Collection, select Tasks, and click Configure Data Collection. This will launch the Configure Data Collection Wizard.

After the Welcome page, you'll find the Setup Data Collection Sets page shown in Figure 2. Besides needing to specify the server and database that will host the Management Data Warehouse, you need to specify the data collector sets. In the list of collection sets, select the Transaction Performance Collection Sets check box so that the data collector will collect statistics for transaction performance issues.

Figure 2: Specifying the Data Collector Sets

If the Management Data Warehouse is located on a different SQL Server instance from the data collector and if SQL Server Agent isn't running under a domain account that has dc_admin permissions on the remote instance, you have to use a SQL Server Agent proxy. If that's the case, be sure to select the Use a SQL Server Agent proxy for remote uploads check box.

Once you're done configuring the Setup Data Collection Sets page, click Finish. When the wizard completes the configuration, you'll have an enabled data collection process that will collect information about all user databases. Note that SQL Server Agent must be started on the instance that will collect the data.

In the SQL Server Agent's Jobs folder, you'll see the jobs used to collect data from your workloads and the jobs used to upload the collected data into the Management Data Warehouse. The data collection jobs use the naming convention collection_set_N_collection, where N is a number. The upload jobs use the naming convention collection_set_N_upload, where N is a number.

By default, the AMR tool collects data from three dynamic management views every 15 minutes for both the Stored Procedure Usage Analysis and Table Usage Analysis collection sets. The upload job runs every 30 minutes for the Stored Procedure Usage Analysis collection set and every 15 minutes for the Table Usage Analysis collection set. If you want to speed your upload, you can execute these jobs manually. Uploading the data has a minimal impact on performance.

Running the Transaction Performance Analysis Reports

To access the recommendations based on the information collected about all your user databases on the workload server, you need to run the transaction performance analysis reports. To access them, right-click your Management Data Warehouse database, select Reports, choose Management Data Warehouse, and click Transaction Performance Analysis. From the Transaction Performance Analysis Overview page, you can choose to run three reports, depending on what type of information you need:

  • Recommended Tables Based on Usage
  • Recommended Tables Based on Contention
  • Recommended Stored Procedures Based on Usage

Recommended Tables Based on Usage. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their usage. Figure 3 shows a sample report. On the left side, you can select the database and how many tables you'd like to see from that database. The chart will then show the selected tables. The horizontal axis represents the amount of work needed to migrate a table to In-Memory OLTP. The vertical axis represents the gains you'll achieve after migrating the table. The best candidates for In-Memory OLTP are located in the top right corner. As you can see, they can be easily migrated and will give you the best performance gain.

Figure 3: Determining Which Tables Are the Best Candidates for Migration Based on Usage

You can access a detailed report for a table by clicking its name in the chart. As Figure 4 shows, this report provides the table's access statistics (e.g., lookups, range scan) and contention statistics (e.g., latches, locks), as well as when this information was captured.

Figure 4: Reviewing the Detailed Performance Statistics for a Table

Recommended Tables Based on Contention. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their contention. If you compare the contention analysis report in Figure 5 with the usage analysis report in Figure 3, you'll see that they're very similar.

Figure 5: Determining Which Tables Are the Best Candidates for Migration Based on Contention

You can select the database and how many tables you'd like to see from that database. The resulting chart shows the amount of work needed to migrate the tables (horizontal axis) and the gains you'll achieve after migrating them (vertical axis). In the top right corner, you'll find the best candidates for migration based on contention. You can click a table name in the chart to access a detailed report showing the table's statistics. This report provides the table's access and contention statistics.

Recommended Stored Procedures Based on Usage. This report shows you which stored procedures are the top candidates for an In-Memory OLTP migration based on their usage (i.e., total CPU time). After selecting the database and how many stored procedures you'd like to see from that database, the resulting chart shows the top candidates for migration, as Figure 6 shows.

Figure 6: Seeing Which Stored Procedures Are the Top Candidates for Migration Based on Usage

If you want to see the detailed usage statistics for a specific stored procedure, you can click its blue bar. Figure 7 shows an example of the report you'll receive.

Figure 7: Reviewing the Detailed Usage Statistics for a Stored Procedure

Using the Memory Optimization Advisor

  • After you know which tables you want to migrate to In-Memory OLTP, you can use the AMR tool's Memory Optimization Advisor to help you with the migration process. To access this advisor, open Object Explorer in SSMS and navigate to the table you want to migrate. Right-click the table and choose Memory Optimization Advisor.
  • The advisor will launch with the Introduction page, which you can read or skip. Clicking Next brings you to the Migration Optimization Checklist page, where the advisor will check to see if your table can be migrated. If one or more validation items fail, the migration process will stop. If needed, you can generate a report for this analysis. If all you see are green checkmarks, your table doesn't have any features that could prevent the migration process, in which case you can proceed to the next page.
  • On the Migration Optimization Warnings page, you'll find important information about what isn't supported in memory-optimized tables and other types of issues. The issues listed won't prevent the table from being migrated, but they might cause other objects to fail or behave in an unexpected manner.

If a warning applies to the table you selected for migration, an exclamation point in a yellow triangle will appear next to the warning, as shown in Figure 8.

Figure 8: Reviewing the Migration Optimization Warnings

In this case, the selected table has an unsupported French_CI_AS collation on the indexed column named Person_OnDisk_Name. (Only BIN2 collations are supported for indexes in memory-optimized tables.) Thus, the index collation will need to be changed later in the migration process.

Figure 9: Reviewing the Optimization Options

On the Review Optimization Options page, which Figure 9 shows, you have the option to change the defaults listed for the:

  • Name of memory-optimized file group (only one memory-optimized file group is allowed per instance)
  • Logical filename
  • Path where the logical file will be saved
  • New name given to the original table (the original table is renamed to prevent naming conflicts)

You can also choose to copy data from the original table to the new memory-optimized table during the migration process, and you can change the durability of the memory-optimized table. By default, its DURABILITY option will be set to schema_and_data, but you can change it to schema_only by selecting the "Check this box to migrate this table to a memory-optimized table with no data durability" option. If you do so, the data will be lost after the SQL Server service is restarted; in other words, only the table's schema is persistent. Finally, the Review Optimization Options page shows the estimated current memory cost for the memory-optimized table. If there isn't sufficient memory, the migration process might fail.
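The durability choice the advisor offers maps directly to the DURABILITY option of the CREATE TABLE statement it will generate. As a minimal sketch (the table and column names here are illustrative, not taken from the advisor's output):

```sql
-- Sketch: the advisor's durability setting corresponds to DURABILITY
-- (memory-optimized tables require at least one index, hence the PK)
CREATE TABLE dbo.Person_InMemory
(
    PersonId BIGINT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000)
)
WITH (MEMORY_OPTIMIZED = ON,
      DURABILITY = SCHEMA_AND_DATA); -- or SCHEMA_ONLY for no data durability
```

With SCHEMA_ONLY, the table definition survives a service restart but its rows do not, which matches the warning the advisor displays.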

Once you're done with the Review Optimization Options page, you can click Next to go to the Review Primary Key Conversion page. When the migration process begins, it will start by converting the primary key. You can convert it to:

  • A nonclustered hash index, which gives the best performance for point lookups. If you select this option, you also need to specify the bucket count, which should be twice the expected number of rows.
  • A nonclustered index, which gives the best performance for range predicates.

For each index in the table being migrated, you'll be presented with a Review Index Conversion page that has been populated with the columns and data types for that index. The options you can configure on the Review Index Conversion page are similar to those on the Review Primary Key Conversion page. In this case, for the indexed column Person_OnDisk_Name with the unsupported French_CI_AS collation, you'd have to select a BIN2 collation for the character column.
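The script the advisor generates for these conversions looks roughly like the following sketch (the table name, bucket count, and chosen BIN2 collation are illustrative assumptions): a hash index for point lookups sized at about twice the expected row count, a BIN2 collation replacing the unsupported one, and a range (nonclustered) index where range predicates matter:

```sql
-- Illustrative sketch of the converted table and its indexes
CREATE TABLE dbo.Person_OnDisk_InMemory
(
    Person_OnDisk_Id BIGINT NOT NULL
        -- point lookups: hash index, bucket count ~ 2x expected rows
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 2000000),
    -- a BIN2 collation replaces the unsupported French_CI_AS
    Person_OnDisk_Name NVARCHAR(100)
        COLLATE Latin1_General_100_BIN2 NOT NULL,
    -- range predicates: a (range) nonclustered index
    INDEX IX_Person_OnDisk_Name NONCLUSTERED (Person_OnDisk_Name)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

In SQL Server 2014, all indexes on a memory-optimized table must be declared inline in the CREATE TABLE statement, which is why the advisor rebuilds the table rather than altering it.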

On the Verify Migration Actions page, you'll see all operations that will be performed to migrate your table to In-Memory OLTP. You have the option to script those operations by clicking the Script button. After verifying all the options, you can click the Migrate button to start the migration process.

Figure 10 shows how the new memory-optimized table appears in SSMS. If you view its properties, you'll see that the Memory optimized property is set to True and that the schema and data are durable for this table.

Figure 10: New Memory-Optimized Table in SSMS

In Figure 10, you can also see how the original table has been renamed.

Using the Native Compilation Advisor

After you know which stored procedures you want to migrate to In-Memory OLTP, you can use the AMR tool's Native Compilation Advisor to help you with their migration. To access this advisor, open Object Explorer in SSMS and navigate to the stored procedure you want to migrate. Right-click the stored procedure and choose Native Compilation Advisor.

After clicking through the Welcome page, you'll be presented with the Stored Procedure Validation page, which will give you warnings if your stored procedure contains some T-SQL elements that aren't supported by native compilation. If the stored procedure is valid, it can become a natively compiled stored procedure without modification. However, the Native Compilation Advisor doesn't migrate stored procedures like the Memory Optimization Advisor migrates tables. You have to do the migration on your own.

If the stored procedure has unsupported T-SQL elements, the validation will fail. To see the details about the unsupported elements, you need to click Next to go to the Stored Procedure Validation Result page, which Figure 11 shows.

Figure 11: Reviewing the Unsupported T-SQL Elements

You have to modify the unsupported elements before you can migrate your stored procedure to a natively compiled stored procedure.

Eliminate the Guesswork

The AMR tool is useful because it eliminates the guesswork in determining which tables and stored procedures would benefit from In-Memory OLTP. After identifying which tables to migrate, you can use the Memory Optimization Advisor to quickly migrate them. Although the Native Compilation Advisor can help you identify the T-SQL elements you need to change before migrating your stored procedure to a natively compiled one, it unfortunately doesn't guide you through the migration process.

 



SQL 2014 Hosting Tutorial - ASPHostPortal.com :: How to Add Data in SQL Server 2014

clock October 31, 2014 06:03 by author Ben

In the earlier post, we looked at the best way to create a table in SQL Server 2014. We now have a database and a table, but no data.

There are many ways to get data into your database in SQL Server 2014. Here are the main ones:

  • Manually: Type data directly into your table rows.
  • Copy/Paste: Similar to the previous option, but here you copy data from another source and paste it into a table in your database.
  • Import: You can use the Import and Export Wizard to import data from another source.
  • SQL Scripts: You can run a SQL script that contains all the data to insert.
  • Application/Website: Users update the database via an application or website.

Here's more detail on each of these methods.

Manually

We will use the Edit Top 200 Rows option to manually type data directly into the table rows.

Manually entering data is OK if you have only a little data to enter, but it's a bit clunky and impractical if you have a lot of data. It also doesn't really suit most business requirements, where non-technical users need to be able to update the database.

In any case, here's how to manually enter data directly into the table:

  1. In the Object Explorer, right-click the table you wish to open, and select Edit Top 200 Rows:

  2. You can now start entering the data directly into your table.

Copy/Paste

You can use a similar method to the above by copying from another data source and pasting into your database table. Of course, this requires that the source table has the same columns as the destination table. As with the manual method above, this is fine for a small amount of data but not for a large amount.

Here's how to copy/paste into your table:

  1. Select all required records from the datasource
  2. In the destination database (i.e. the one you want to populate with data), right-click on the destination table and select Edit Top 200 Rows
  3. Select an empty row by right-clicking in the left-most column (it's more of a button to the left of your left-most column that allows you to select the whole row) and select Paste from the contextual menu:


    If you need to paste more than 200 rows, click the Show SQL Pane icon from the toolbar to display the SQL statement behind the 200 rows being displayed. Simply change the 200 to a larger number of rows as required.



    Note that this will work up to a certain extent, but you may encounter times where you have so much data to transfer that you need to use another method.

Import

You can import data from another data source. The end result is similar to the copy/paste method (i.e. data is copied across to the destination database), but importing is more flexible and can be more suitable in many cases. For example, you can select data from multiple views and tables, and you can write a query on the fly to import only the data you need.

To import data, right-click on the database and select Tasks > Import Data... and follow the Wizard from there.

The SQL Server Import and Export Wizard can copy data to and from any data source for which a managed .NET Framework data provider or a native OLE DB provider is available. These include:

  • SQL Server
  • Flat files
  • Microsoft Office Access
  • Microsoft Office Excel

You can also start the SQL Server Import and Export Wizard from the Windows Start menu, from within SQL Server Data Tools (SSDT), and from the command prompt (by running DTSWizard.exe, which you will find in C:\Program Files\Microsoft SQL Server\100\DTS\Binn or C:\Program Files\Microsoft SQL Server\120\DTS\Binn, or another location depending on your configuration and drive letter).

SQL Scripts

In many circumstances, you'll find it more efficient to run a SQL script that contains the data you need to insert. You can use the SQL INSERT statement to insert exactly the data you specify in the statement.
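For example, a minimal reference-data script might look like the following (the table and the country values are hypothetical, chosen only to illustrate the pattern):

```sql
-- Hypothetical reference table and static data for illustration
CREATE TABLE dbo.Country
(
    CountryCode CHAR(2)       NOT NULL PRIMARY KEY,
    CountryName NVARCHAR(100) NOT NULL
);

INSERT INTO dbo.Country (CountryCode, CountryName)
VALUES ('US', N'United States'),
       ('FR', N'France'),
       ('ID', N'Indonesia');
```

Because the script is self-contained, it can be checked into source control and replayed on any environment that needs the same reference data.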

SQL scripts are excellent for inserting static/reference data (such as countries/regions). They can be saved and run again whenever required (for example, on another database). Scripts aren't generally so good for data that continually changes (like customer details); you probably wouldn't keep a copy of outdated data in a SQL script. But there are exceptions. For example, you might use such a script to populate a customer table in your testing/development environment.

More about SQL scripts to come.

Application/Website

Most SQL Server databases are the backend data storage for a front-end application. Users of the application are responsible for adding data to the database (as well as editing it). Therefore, many of the tables in your database will be updated through the application. In this case, the application is updating the database using SQL scripts.

The main difference between these scripts and the kind we discussed above is that the scripts used within the application will be dynamic. They accept data as a parameter that is passed to the script. So the user can enter, say, an email address into the application, and unbeknownst to him, the application runs a SQL script that takes his email address, validates it, adds it to the script, and, if it passes all the business/validation rules, inserts it into the database.
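As a sketch of what such a dynamic script might look like, the application would call a parameterized stored procedure rather than pasting the user's input into the SQL text (the table, procedure name, and validation rule below are made up for illustration):

```sql
-- Hypothetical parameterized insert; the application supplies @EmailAddress
CREATE PROCEDURE dbo.AddCustomerEmail
    @EmailAddress NVARCHAR(255)
AS
BEGIN
    SET NOCOUNT ON;

    -- a trivial validation rule; real business rules would go here
    IF @EmailAddress LIKE '%_@_%._%'
        INSERT INTO dbo.CustomerEmail (EmailAddress)
        VALUES (@EmailAddress);
END
GO

-- the application effectively runs:
EXEC dbo.AddCustomerEmail @EmailAddress = N'user@example.com';
```

Passing the value as a parameter also protects the database against SQL injection, which string-concatenated scripts do not.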

These SQL scripts can be placed directly into your website code (PHP, ColdFusion, and so forth), or they can be stored in the database as stored procedures or views and run only when the application calls them.




SQL 2014 Hosting Tutorial :: How to Create Histogram Chart in SQL

clock September 15, 2014 08:42 by author Ben

SQL Server 2014 is an amazing release in terms of the various performance enhancements that are part of it. Now I want to show you how to create a histogram chart in SQL.

In this tip we will create a histogram chart in SQL Server 2014 to show how to aggregate data with very little effort.

Step 1
Let's create a sample table called Employee that has columns EmpID and EmpAge with the data as shown in the below screenshot.
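Since the screenshot isn't reproduced here, a table along these lines can be created with the script below. The fifteen ages are made-up sample values, but they are chosen to fall into the 20 - 35 range the later steps group on:

```sql
-- Sample table for the histogram demo
CREATE TABLE dbo.Employee
(
    EmpID  INT IDENTITY(1,1) PRIMARY KEY,
    EmpAge INT NOT NULL
);

-- fifteen sample employees aged 20-35 (illustrative values)
INSERT INTO dbo.Employee (EmpAge)
VALUES (21), (22), (23), (24),             -- 4 in the 20-25 group
       (25), (26), (27), (28), (29),       -- 5 in the 25-30 group
       (30), (31), (32), (33), (34), (35); -- 6 in the 30-35 group
```

With this data, the three age buckets produced in Step 6 will contain 4, 5, and 6 employees respectively, matching the counts described below.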

Step 2
Create a new SSRS report and configure it to use the data from the table we just created. Add a bar chart to the report and configure it as shown in the below screenshot.

Step 3
Execute/Preview this report and your report should look like the below screenshot and you will see data listed for all fifteen employees.

In reality, an employee table can contain records for hundreds to thousands of employees and therefore if the requirement is to analyze age distribution this type of report won't serve the purpose. A typical approach to solve this issue would be to retrieve the count of employees falling into a few predefined age categories and show it in a distribution chart like a pie-chart. In the next few steps we will see how a histogram chart can be a much simpler approach.

Step 4
Select the EmpAge chart series and open the properties window. In the CustomAttribute category, select "ShowColumnAs" property and set the value as "Histogram" as shown in the below screenshot.

Step 5
Execute the report and you should find results similar to the below screenshot. This is a histogram chart, but the problem with this chart is that by default the chart has selected all fifteen distributions, so you get one employee for each age category which is not very helpful.

Step 6
Let's say we want to see the data in three age groups 20 - 25, 25 - 30 and 30 - 35 and the number of employees in each age group. In the CustomAttribute category, you will find a property "HistogramSegmentIntervalWidth" and the default value is zero. Change this value to "5", so that the age group we have is divided into 3 categories as desired. Execute the report and the report should look like the below screenshot. This chart shows 4 employees in the 20 - 25 group, 5 employees in the 25 - 30 group and 6 employees in the 30 - 35 group.

Also note that the "Percentage of Total" axis is generated and calculated automatically without any programming effort, and this axis is very useful in statistical analysis. This type of axis is not possible out-of-the-box in charts like pie charts, which is why charts such as histograms are often preferred.



Reasons Why You Should Choose ASPHostPortal.com
ASPHostPortal.com offers the newest Microsoft SQL 2014 database with all our hosting plans. We are your dependable SQL web hosting provider who will fulfill your needs!


Full Remote Access
We allow you full remote connectivity to your SQL Server 2014 Hosting database and do not restrict access in any way.

Easily transfer your existing SQL Server database
With our SQL Server hosting package, there's no need to rebuild your database from scratch should you wish to transfer an existing SQL Server database to us. If you already have a database hosted elsewhere, you can easily transfer the contents of your database using SQL Server Management Studio which is fully supported by our packages. SSMS provides you with an Import/Export wizard which you can use to upload your data and stored procedures with a couple of clicks.



SQL Server 2014 Hosting with ASPHostPortal.com :: How to Restart an Interrupted SQL Server Database Restore

clock August 23, 2014 09:53 by author Kenny

SQL Server is Microsoft's relational database management system (RDBMS). It is a full-featured database platform primarily designed to compete against Oracle Database and MySQL.

Like all major RDBMSs, SQL Server supports ANSI SQL, the standard SQL language. However, SQL Server also includes T-SQL, its own SQL implementation. SQL Server Management Studio (SSMS), previously known as Enterprise Manager, is SQL Server's main interface tool, and it supports 32-bit and 64-bit environments.

In this article, we will show you how to restart an interrupted SQL Server database restore.
Have you ever restored a large database on a failover cluster production server and had the restore fail partway through due to a network failure? Once SQL Server came up on the other node, all the databases came online except for the database you were restoring prior to the failover. In this tip we will take a look at the RESTORE DATABASE...WITH RESTART command to see how it can be helpful in such scenarios.

Here is the solution

The RESTORE DATABASE...WITH RESTART command is a very useful command which is available in SQL Server 2005 and higher versions. A Database Administrator can use this command to finish restoring an interrupted database restore operation.

In the below snippet you can see that ProductDB is in a (Restoring...) state once the SQL Server came online after the unexpected failure.
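You can confirm the database is stuck in this state by querying sys.databases (the database name here matches the restore example that follows):

```sql
-- check the state of the interrupted database
SELECT name, state_desc
FROM sys.databases
WHERE name = 'ProductDB';
-- state_desc remains RESTORING until the restore is completed
```

Any attempt to connect to a database in the RESTORING state fails, which is why finishing the restore is the only way to bring it back online.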

During such scenarios one can execute the RESTORE DATABASE...WITH RESTART command to successfully complete the database restore operation.

Below are two commands. The first gets a list of the backups in the file, and the second does the actual restore with the restart option.

-- get backup information from the backup file
RESTORE FILELISTONLY
FROM DISK = 'C:\DBBackups\ProductDB.bak'
GO

-- restore the database
RESTORE DATABASE ProductDB
FROM DISK = 'C:\DBBackups\ProductDB.bak'
WITH RESTART
GO

Below you can see that after running the RESTORE DATABASE...WITH RESTART command the database was successfully restored allowing user connectivity.



About ASPHostPortal.com

We’re a company that works differently to most. Value is what we output and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic. It’s based on an even simpler principle. "Do good things, and good things will come to you".

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience - more than the journey. Life is a continual experience. We see the Internet as an incredible amplifier of the experience of life for all of us. It can help humanity come together to explore and discuss knowledge. It is the continual enlightenment of new ideas, experiences, and passions.


Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
