Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

SQL 2014 Hosting - ASPHostPortal.com :: Handling Divide BY Zero Exception

clock March 16, 2015 07:24 by author Ben

Sometimes, while performing a calculation in your query, you get a "Divide by Zero" error or exception, and sometimes your output value will be NULL. So how do you handle these problems and exceptions?

Use NULLIF - to handle the divide by zero exception
Use ISNULL - to show some value instead of NULL in your output

Below is the full explanation.
select 10/0

If you run the above query, it will throw an error:

Msg 8134, Level 16, State 1, Line 1
Divide by zero error encountered.

Now, in order to resolve this, we'll use the NULLIF function:

select nullif(10/nullif(0,0),0)

Output NULL

Now it will not throw an error, and your output will be NULL. Explanation: the NULLIF function takes two arguments and checks whether they are equal. If they are equal, it returns NULL as the output;
if the two expressions are not equal, it returns the first expression with the same data type. In the denominator, it checked whether 0 is equal to 0. The condition was true, so it returned NULL for the denominator. The expression then became:

select Nullif(10/null,0)

Now it again checks whether 10/NULL is equal to zero. The condition fails, and you get:

Output  NULL

Note: any number divided, multiplied, added, or subtracted with NULL results in NULL. Now, how do you use NULLIF and ISNULL in your code or in T-SQL to avoid the exception and NULL values? In the code below, I have used NULLIF together with the ISNULL function. For this, we will create a table:

CREATE TABLE Test_NULLIF1
(
   col1            int   IDENTITY,
   col2      int   NULL,
   col3   int   NULL
);
INSERT Test_NULLIF1  VALUES(10, 20);
INSERT Test_NULLIF1 VALUES(NULL, 23);
INSERT Test_NULLIF1 VALUES(0, 18);
INSERT Test_NULLIF1 VALUES(NULL,75);
INSERT Test_NULLIF1 VALUES(300000, 21);
Go

If you divide col3 by col2 for row 3, it will throw a divide by zero exception. Run this query to generate the error:

select col3/col2 from Test_NULLIF1 where col1 = 3

Msg 8134, Level 16, State 1, Line 1
Divide by zero error encountered.

In order to handle this, use the NULLIF function as explained in the example above:

select nullif(col3/nullif(col2,0),0)  from Test_NULLIF1  where col1 =3

Output NULL

Now you can use the ISNULL function, after handling the divide by zero exception, to provide a value when the output would otherwise be NULL.
For example, instead of NULL you may want to show a value like 0 or 1:

select ISNULL(col3/nullif(col2,0),1)  from Test_NULLIF1  where col1 =3

Output 1

Explanation: the ISNULL function takes two arguments and checks whether the first expression is NULL. If it is NULL, the second expression provides a replacement for that NULL; it can be any value.
In the query, ISNULL checked the first expression; it was NULL, so it replaced the NULL value with 1
and returned 1 as the output.

If the first expression is not NULL, it simply returns the first expression's value.

Test the ISNULL function with the queries below:

Select ISNULL(null,2)

output 2

select isnull(3,1) -- as the first expression is not NULL, it returns the first value, i.e. 3

output 3
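
To apply the same pattern to every row at once, rather than filtering on col1 = 3, a query like the sketch below works; the replacement value 0 is just an illustration:

select col1, ISNULL(col3 / NULLIF(col2, 0), 0) AS result
from Test_NULLIF1

Rows where col2 is 0 or NULL come back as 0 instead of raising an error.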

Best SQL 2014 Hosting Recommendation

ASPHostPortal.com

ASPHostPortal.com provides our customers with Plesk Panel, one of the most popular and stable control panels for Windows hosting, for free. You also get the latest .NET Framework, a wide range of functionality, as well as large disk space, bandwidth, MSSQL databases, and more. All of this makes it convenient to build a powerful site on a Windows server. We offer SQL 2014 hosting starting from $5/month only. We also guarantee 30 days money back and 99.9% uptime. If you need reliable, affordable SQL 2014 Hosting, we should be your best choice.



SQL 2014 Hosting Tutorial - ASPHostPortal.com :: SQL Server 2014 Analysis, Migrate, and Report Tool

clock November 10, 2014 11:28 by author Mark

Determine which tables and stored procedures would benefit from In-Memory OLTP

With SQL Server 2014 new In-Memory OLTP engine, you can load tables and stored procedures in memory, which provides very fast response times. The goal isn't to load all the database tables and stored procedures in memory, but rather just those tables that are crucial to performance and those stored procedures that have complex logical calculations.

To help you identify which tables and stored procedures will give you the best performance gain after being migrated to In-Memory OLTP, SQL Server 2014 provides the new Analysis, Migrate, and Report (AMR) tool. Built into SQL Server Management Studio (SSMS), the AMR tool consists of the:

  • Transaction performance collector (which collects data about existing tables and stored procedures in order to analyze workloads) and transaction performance analysis reports (which give recommendations about the tables and stored procedures to migrate to In-Memory OLTP based on the collected data)
  • Memory Optimization Advisor (which guides you through the process of migrating a table to a memory-optimized table)
  • Native Compilation Advisor (which helps you identify T-SQL elements that need to be changed before migrating a stored procedure to a natively compiled stored procedure)

The AMR tool leverages the new Transaction Performance Collection Sets for gathering information about workloads and the Management Data Warehouse (a relational database) to store the collected data. The Transaction Performance Collection Sets include the:

  • Stored Procedure Usage Analysis collection set (which captures information about stored procedures for a future migration to natively compiled stored procedures)
  • Table Usage Analysis collection set (which captures information about disk-based tables for a future migration to memory optimized tables)

Before you can use the AMR tool, you need to configure the Management Data Warehouse and the data collection process. After showing you how to do so, I'll demonstrate how to run the transaction performance analysis reports and how to use the two advisors.

Configuring the Management Data Warehouse

To configure the Management Data Warehouse, go to Object Explorer in SSMS. Expand the Management folder, right-click Data Collection, select Tasks, and click Configure Management Data Warehouse. This will launch the Configure Management Data Warehouse Wizard.

After the Welcome page, you'll find yourself on the Select Configuration Task page. On this page, select the option to configure a Management Data Warehouse.

  • On the Configure Management Data Warehouse Storage page, you need to specify the name of the database that will host the Management Data Warehouse and the name of the server on which that database resides. If you need to create the database, click the New button to create one.
  • On the Map Logins and Users page, you'll find the existing logins allowed for the server that will host the Management Data Warehouse. If needed, you can edit the logins or map users to the administrator, reader, and writer roles for the Management Data Warehouse.
  • On the Complete the Wizard page, you need to verify the Management Data Warehouse configuration. If it's OK, click Finish. When the configuration of the Management Data Warehouse has successfully completed, you should see a page like that in Figure 1.

Figure 1: Verifying Configuration of the Management Data Warehouse

The Management Data Warehouse setup is now finished.

Configuring the Data Collection Process

To configure the data collection process, go to Object Explorer in SSMS. Expand the Management folder, right-click Data Collection, select Tasks, and click Configure Data Collection. This will launch the Configure Data Collection Wizard.

After the Welcome page, you'll find the Setup Data Collection Sets page shown in Figure 2. Besides needing to specify the server and database that will host the Management Data Warehouse, you need to specify the data collector sets. In the list of collection sets, select the Transaction Performance Collection Sets check box so that the data collector will collect statistics for transaction performance issues.

Figure 2: Specifying the Data Collector Sets

If the Management Data Warehouse is located on a different SQL Server instance from the data collector and if SQL Server Agent isn't running under a domain account that has dc_admin permissions on the remote instance, you have to use a SQL Server Agent proxy. If that's the case, be sure to select the Use a SQL Server Agent proxy for remote uploads check box.

Once you're done configuring the Setup Data Collection Sets page, click Finish. When the wizard completes the configuration, you'll have an enabled data collection process that will collect information about all user databases. Note that SQL Server Agent must be started on the instance that will collect the data.

In the SQL Server Agent's Jobs folder, you'll see the jobs used to collect data from your workloads and the jobs used to upload the collected data into the Management Data Warehouse. The data collection jobs use the naming convention collection_set_N_collection, where N is a number. The upload jobs use the naming convention collection_set_N_upload, where N is a number.

By default, the AMR tool collects data from three dynamic management views every 15 minutes for both the Stored Procedure Usage Analysis and Table Usage Analysis collection sets. The upload job runs every 30 minutes for the Stored Procedure Usage Analysis collection set and every 15 minutes for the Table Usage Analysis collection set. If you want to speed up your uploads, you can execute these jobs manually. Uploading the data has a minimal impact on performance.
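
For example, if you don't want to wait for the next scheduled run, you can start an upload job yourself through SQL Server Agent. A minimal sketch, assuming the job on your instance is named collection_set_5_upload (check the actual name in the Jobs folder):

-- Manually kick off one of the upload jobs
EXEC msdb.dbo.sp_start_job @job_name = N'collection_set_5_upload';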

Running the Transaction Performance Analysis Reports

To access the recommendations based on the information collected about all your user databases on the workload server, you need to run the transaction performance analysis reports. To access them, right-click your Management Data Warehouse database, select Reports, choose Management Data Warehouse, and click Transaction Performance Analysis. From the Transaction Performance Analysis Overview page, you can choose to run three reports, depending on what type of information you need:

  • Recommended Tables Based on Usage
  • Recommended Tables Based on Contention
  • Recommended Stored Procedures Based on Usage

Recommended Tables Based on Usage. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their usage. Figure 3 shows a sample report. On the left side, you can select the database and how many tables you'd like to see from that database. The chart will then show the selected tables. The horizontal axis represents the amount of work needed to migrate a table to In-Memory OLTP. The vertical axis represents the gains you'll achieve after migrating the table. The best candidates for In-Memory OLTP are located in the top right corner. As you can see, they can be easily migrated and will give you the best performance gain.

Figure 3: Determining Which Tables Are the Best Candidates for Migration Based on Usage

You can access a detailed report for a table by clicking its name in the chart. As Figure 4 shows, this report provides the table's access statistics (e.g., lookups, range scan) and contention statistics (e.g., latches, locks), as well as when this information was captured.

Figure 4: Reviewing the Detailed Performance Statistics for a Table

Recommended Tables Based on Contention. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their contention. If you compare the contention analysis report in Figure 5 with the usage analysis report in Figure 3, you'll see that they're very similar.

Figure 5: Determining Which Tables Are the Best Candidates for Migration Based on Contention

You can select the database and how many tables you'd like to see from that database. The resulting chart shows the amount of work needed to migrate the tables (horizontal axis) and the gains you'll achieve after migrating them (vertical axis). In the top right corner, you'll find the best candidates for migration based on contention. You can click a table name in the chart to access a detailed report showing the table's statistics. This report provides the table's access and contention statistics.

Recommended Stored Procedures Based on Usage. This report shows you which stored procedures are the top candidates for an In-Memory OLTP migration based on their usage (i.e., total CPU time). After selecting the database and how many stored procedures you'd like to see from that database, the resulting chart shows the top candidates for migration, as Figure 6 shows.

Figure 6: Seeing Which Stored Procedures Are the Top Candidates for Migration Based on Usage

If you want to see the detailed usage statistics for a specific stored procedure, you can click its blue bar. Figure 7 shows an example of the report you'll receive.

Figure 7: Reviewing the Detailed Usage Statistics for a Stored Procedure

Using the Memory Optimization Advisor

  • After you know which tables you want to migrate to In-Memory OLTP, you can use the AMR tool's Memory Optimization Advisor to help you with the migration process. To access this advisor, open Object Explorer in SSMS and navigate to the table you want to migrate. Right-click the table and choose Memory Optimization Advisor.
  • The advisor will launch with the Introduction page, which you can read or skip. Clicking Next brings you to the Migration Optimization Checklist page, where the advisor will check to see if your table can be migrated. If one or more validation items fail, the migration process will stop. If needed, you can generate a report for this analysis. If all you see are green checkmarks, your table doesn't have any features that could prevent the migration process, in which case you can proceed to the next page.
  • On the Migration Optimization Warnings page, you'll find important information about what isn't supported in memory-optimized tables and other types of issues. The issues listed won't prevent the table from being migrated, but they might cause other objects to fail or behave in an unexpected manner.

If a warning applies to the table you selected for migration, an exclamation point in a yellow triangle will appear next to the warning, as shown in Figure 8.

Figure 8: Reviewing the Migration Optimization Warnings

In this case, the selected table has an unsupported French_CI_AS collation on the indexed column named Person_OnDisk_Name. (Only BIN2 collations are supported for indexes in memory-optimized tables.) Thus, the index collation will need to be changed later in the migration process.

Figure 9: Reviewing the Optimization Options

On the Review Optimization Options page, which Figure 9 shows, you have the option to change the defaults listed for the:

  • Name of memory-optimized file group (only one memory-optimized file group is allowed per database)
  • Logical filename
  • Path where the logical file will be saved
  • New name given to the original table (the original table is renamed to prevent naming conflicts)

You can also choose to copy data from the original table to the new memory-optimized table during the migration process, and you can change the durability of the memory-optimized table. By default, its DURABILITY option will be set to schema_and_data, but you can change it to schema_only by selecting the Check this box to migrate this table to a memory-optimized table with no data durability option. If you do so, the data will be lost after the SQL Server service is restarted. In other words, just the table's schema is persistent. Finally, the Review Optimization Options page shows the estimated current memory cost for the memory-optimized table. If there isn't sufficient memory, the migration process might fail.

Once you're done with the Review Optimization Options page, you can click Next to go to the Review Primary Key Conversion page. When the migration process begins, it will start by converting the primary key. You can convert it to:

  • A nonclustered hash index, which gives the best performance for point lookups. If you select this option, you also need to specify the bucket count, which should be twice the expected number of rows.
  • A nonclustered index, which gives the best performance for range predicates.
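
Stepping outside the wizard for a moment, the script it generates boils down to T-SQL along these lines. This is only a rough sketch: the database name, filegroup, file path, table, column names, and bucket count are placeholders, and as noted above the bucket count should be roughly twice the expected number of rows.

-- Only one memory-optimized filegroup is allowed per database
ALTER DATABASE YourDatabase
    ADD FILEGROUP YourDatabase_mod CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE YourDatabase
    ADD FILE (NAME = 'YourDatabase_mod', FILENAME = 'C:\Data\YourDatabase_mod')
    TO FILEGROUP YourDatabase_mod;
GO

-- Memory-optimized version of the table, with a hash primary key
-- (use DURABILITY = SCHEMA_ONLY to keep only the schema across restarts)
CREATE TABLE dbo.Person_InMemory
(
    PersonId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 2000000),
    Person_OnDisk_Name NVARCHAR(100)
        COLLATE Latin1_General_100_BIN2 NOT NULL,   -- indexed char columns need a BIN2 collation
    INDEX ix_Person_Name NONCLUSTERED (Person_OnDisk_Name)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);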

For each index you have in the table being migrated, you'll be presented with a Review Index Conversion page that has been populated with the columns and data types for that index. The options you can configure in the Review Index Conversion page are similar to those in the Review Primary Key Conversion page. In this case, for the indexed column Person_OnDisk_Name with the unsupported French_CI_AS collation, you'd have to select a BIN2 collation for the char data type.

On the Verify Migration Actions page, you'll see all operations that will be performed to migrate your table to In-Memory OLTP. You have the option to script those operations by clicking the Script button. After verifying all the options, you can click the Migrate button to start the migration process.

Figure 10 shows how the new memory-optimized table appears in SSMS. If you view its properties, you'll see that the Memory optimized property is set to True and that the schema and data are durable for this table.

Figure 10: New Memory-Optimized Table in SSMS

In Figure 10, you can also see how the original table has been renamed.

Using the Native Compilation Advisor

After you know which stored procedures you want to migrate to In-Memory OLTP, you can use the AMR tool's Native Compilation Advisor to help you with their migration. To access this advisor, open Object Explorer in SSMS and navigate to the stored procedure you want to migrate. Right-click the stored procedure and choose Native Compilation Advisor.

After clicking through the Welcome page, you'll be presented with the Stored Procedure Validation page, which will give you warnings if your stored procedure contains some T-SQL elements that aren't supported by native compilation. If the stored procedure is valid, it can become a natively compiled stored procedure without modification. However, the Native Compilation Advisor doesn't migrate stored procedures like the Memory Optimization Advisor migrates tables. You have to do the migration on your own.
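
To give a feel for the end result, the general shape of a natively compiled stored procedure in SQL Server 2014 is sketched below. The procedure name, parameter, and table are placeholders (the table must itself be memory-optimized, such as the one sketched earlier), so treat this as an illustration rather than output of the advisor:

-- Minimal sketch of a natively compiled stored procedure
CREATE PROCEDURE dbo.usp_GetPersonById
    @PersonId INT
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    SELECT PersonId, Person_OnDisk_Name
    FROM dbo.Person_InMemory
    WHERE PersonId = @PersonId;
END;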

If the stored procedure has unsupported T-SQL elements, the validation will fail. To see the details about the unsupported elements, you need to click Next to go to the Stored Procedure Validation Result page, which Figure 11 shows.

Figure 11: Reviewing the Unsupported T-SQL Elements

You have to modify the unsupported elements before you can migrate your stored procedure to a natively compiled stored procedure.

Eliminate the Guesswork

The AMR tool is useful because it eliminates the guesswork in determining which tables and stored procedures would benefit from In-Memory OLTP. After identifying which tables to migrate, you can use the Memory Optimization Advisor to quickly migrate them. Although the Native Compilation Advisor can help you identify the T-SQL elements you need to change before migrating your stored procedure to a natively compiled one, it unfortunately doesn't guide you through the migration process.

 



SQL 2014 Hosting Tutorial - ASPHostPortal.com :: How to Add Data in SQL Server 2014

clock October 31, 2014 06:03 by author Ben

In the earlier post, we learned the best way to create a table in SQL Server 2014. We now have a database and a table, but no data.

There are many ways of getting data into your database in SQL Server 2014. Here are the main ones that come to mind:

  • Manually: Type data directly into your table rows.
  • Copy/Paste: Similar to the previous option, but this is where you copy data from another source and paste it into a table in your database.
  • Import: You can use the Import and Export Wizard to import data from another source.
  • SQL Scripts: You can run a SQL script that contains all the data to insert.
  • Application/Website: Users update the database via an application or website.

Here's more detail on each of these methods.

Manually

We will use the Edit Top 200 Rows option to manually type data directly into the table rows.

Manually entering data is OK if you only have a little bit of data to enter. But it's a bit clunky and can be impractical if you have a lot of data. It also doesn't really suit most business requirements, where non-technical users need to be able to update the database.

In any case, here's how to manually enter data directly into the table:

  1. In the Object Explorer, right click on the table you wish to open, and select Edit Top 200 Rows:

  2. You can now start entering the data directly into your table.

Copy/Paste

You could use a similar method to the one above by copying from another data source and pasting into your database table. Of course, this requires that the source table has the same columns as the destination table. Similar to the manual method above, this is OK for a small amount of data but not for a large amount.

Here's how to copy/paste into your table:

  1. Select all required records from the datasource
  2. In the destination database (i.e. the one you want to populate with data), right-click on the destination table and select Edit Top 200 Rows
  3. Select an empty row by right-clicking in the left-most column (it's more of a button to the left of your left-most column that allows you to select the whole row) and select Paste from the contextual menu:


    If you need to paste more than 200 rows, click the Show SQL Pane icon from the toolbar to display the SQL statement behind the 200 rows being displayed. Simply change the 200 to a larger number of rows as required.



    Note that this will work up to a certain extent, but you may encounter times where you have so much data to transfer that you need to use another method.

Import

You can import data from another data source. The end result is similar to the copy/paste method (i.e. data is copied across to the destination database), but importing the data is much more flexible and can be more suitable on many occasions. For example, you can select data from multiple views and tables, and you can write a query on the fly to import only the data you need.

To import data, right-click on the database and select Tasks > Import Data... and follow the Wizard from there.

The SQL Server Import and Export Wizard can copy data to and from any data source for which a managed .NET Framework data provider or a native OLE DB provider is available. These include:

  • SQL Server
  • Flat files
  • Microsoft Office Access
  • Microsoft Office Excel

You can also start the SQL Server Import and Export Wizard from the Windows Start menu, from within SQL Server Data Tools (SSDT), and from the command prompt (by running DTSWizard.exe, which you will find in C:\Program Files\Microsoft SQL Server\100\DTS\Binn or in C:\Program Files\Microsoft SQL Server\120\DTS\Binn, or another location depending on your configuration and drive letter).

SQL Scripts

In many circumstances, you'll find it more efficient to run a SQL script that contains the data you need to insert. You can use the SQL INSERT statement to insert just the data you specify in the statement.

SQL scripts are excellent for inserting static/reference data (like, say, countries/regions). They can be saved and run again whenever required (for example, on another database). Scripts aren't generally so good for data that continually changes (like customer details); you almost certainly wouldn't be maintaining a copy of outdated data in a SQL script. But there are often exceptions. For example, you might use such a script to populate a customer table in your testing/development environment.
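
Here is a minimal sketch of that kind of reference-data script; the table and the rows are purely illustrative:

-- Illustrative reference-data script
CREATE TABLE Countries
(
    CountryCode char(2)      NOT NULL PRIMARY KEY,
    CountryName nvarchar(60) NOT NULL
);

INSERT INTO Countries (CountryCode, CountryName) VALUES ('US', 'United States');
INSERT INTO Countries (CountryCode, CountryName) VALUES ('ID', 'Indonesia');
INSERT INTO Countries (CountryCode, CountryName) VALUES ('GB', 'United Kingdom');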

More about SQL scripts coming up.

Application/Website

Most SQL Server databases are the backend data storage for a front-end application. Users of the application are responsible for adding data to the database (as well as editing it). As a result, many of the tables in your database will be updated via the application. In this situation, the application is updating the database using SQL scripts.

The main difference between these scripts and the kinds we mentioned above is that the scripts used within an application will usually be dynamic. They accept data as a parameter that is passed to the script. So the user can enter, say, an email address into the application and, unbeknownst to him, the application runs a SQL script that takes his email address, validates it, passes it to the script, and, when it passes all of the business/validation rules, inserts it into the database.

These SQL scripts may be placed directly into your website code (PHP, ColdFusion and so forth), or they can be stored in the database as stored procedures or views and run only when the application says so.
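
As a rough sketch of how such a parameterized script might look when stored as a stored procedure; the procedure name, the Customers table, and the validation rule are assumptions made for illustration:

CREATE PROCEDURE dbo.AddCustomerEmail
    @Email nvarchar(256)
AS
BEGIN
    SET NOCOUNT ON;

    -- Deliberately simple validation rule, for illustration only
    IF @Email NOT LIKE '%_@_%._%'
    BEGIN
        RAISERROR('Invalid email address.', 16, 1);
        RETURN;
    END

    -- Assumes a Customers table with an Email column exists
    INSERT INTO dbo.Customers (Email) VALUES (@Email);
END

The application would then run something like EXEC dbo.AddCustomerEmail @Email = N'user@example.com'; with the value the user typed in.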




SQL 2014 Hosting Tutorial - ASPHostPortal.com :: The Best Way to Create a Table in SQL Server 2014

clock October 24, 2014 07:04 by author Ben

Seeing as our database is a task-tracker database, let's call our first table "Tasks". This table will hold all tasks, regardless of their status (e.g. done, to do, in progress, and so on). Then we can create another table called "Status". Once our tables contain data, we'll be able to run queries against them to find out which tasks need to be done, which ones are in a given status, and so on.

But let's not get ahead of ourselves. Let's create our first table.

These are the steps to create a table in a SQL Server 2014 database using SQL Server Management Studio (SSMS).

  1. Making sure you have the correct database expanded (in our case, the TaskTracker database), right-click on the Tables icon and select Table... from the contextual menu:

  2. A new table will open in Design view. While you have this screen open, do the following:
    • Using the values in the screenshot, complete the details in the Column Name column, Data Type column, and Allow Nulls column.
    • Make the TaskId column a primary key field by right-clicking the button next to TaskId (i.e. the same area where the key appears in the following screenshot) and selecting Set Primary Key.
    • Make the TaskId column an identity column by setting Is Identity to Yes (you can find this option under the Identity Specification section in the bottom pane). Note that to set values in the bottom pane, you need to select the column name in the top pane first. We are setting this column to be an auto-number column - it will automatically generate a new number for every record that is created.
    • Set the Default Value of the DateCreated column to (getdate()). (This will automatically insert the current date into that field for each new record.)



    What we are doing here is creating the column names, specifying the type of data that can be entered into them, and setting default values. Restricting the data type for each column is very important and helps maintain data integrity. For example, it can prevent us from accidentally entering a task name into a field meant for storing the current date. SQL Server will prevent data from entering tables where the data doesn't adhere to the rules that we have set for each column. (A T-SQL equivalent of the finished table appears after step 4.)

  3. Save the table by selecting File > Save Table_1 or by right-clicking on the Table's tab and selecting Save Table_1 from the contextual menu:

  4. When prompted, name your table (we'll call it Tasks):
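
If you'd rather script the table than use the designer, a rough T-SQL equivalent looks like this. Only TaskId and DateCreated are named in the steps above, so the remaining columns and their sizes are assumptions:

CREATE TABLE Tasks
(
    TaskId      int           IDENTITY(1,1) NOT NULL PRIMARY KEY,
    TaskName    nvarchar(50)  NOT NULL,
    Description nvarchar(255) NULL,
    DateCreated datetime      NULL DEFAULT (getdate())
);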


Your New Table

Now that you've created a new table, it will appear under your database in the Tables section. If you don't see it immediately, try refreshing the Object Explorer. You can do this by right-clicking on Tables and clicking Refresh from the contextual menu:

Here's a screenshot of the table with the Columns folder expanded:




Cheap SQL 2014 Hosting with ASPHostPortal.com:: How to Create Cron Job to Backup Your SQL Database

clock June 20, 2014 08:31 by author Ben

Cron jobs are used for scheduling tasks to run on the server. They're most commonly used for automating system maintenance or administration. However, they are also relevant to web application development. There are many situations when a web application may need certain tasks to run periodically. A cron job/scheduled task is a system-side automatic task that can be configured to run an unlimited number of times at a given interval. A cron/scheduled task allows you to schedule a command or a script to run at a specific time of day, on a specific day of the week, or at a particular time of day on a specific day of the month. It allows for scheduling down to the minute and up to an annual event.

Here are the simple steps for creating a cron job to back up your SQL database:

Step 1. Create a folder to store the backup in your FTP application
Open your ftp application and connect to the account that has the database you want to back up. Create a folder outside of the webcontent called "backups".
fs1-n02stor1wc1dfw2382489382489www.yoursite.combackups

Step 2. Set folder permissions in your FTP application
Right-click the folder and add all write permissions. If your FTP software doesn't do this, then check out the free FTP client called FileZilla.

Step 3. Create a stored procedure that performs the backup with an input parameter for the filename
Connect to your database using a client and run a query like this. The procedure will be named FullBackup in this example:

set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[FullBackup]
 @FileName  nvarchar(256)
AS
BEGIN

SET NOCOUNT ON;

BACKUP DATABASE [123456_YourDatabase] TO  DISK = @FileName WITH NOFORMAT, NOINIT,  NAME = N'Full Database Backup', SKIP, NOREWIND, NOUNLOAD,  STATS = 10

END
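
You can test the procedure straight from your SQL client before moving on; the path below is just a placeholder and should point at the backups folder created in step 1:

EXEC dbo.FullBackup @FileName = N'\\your-storage-path\www.yoursite.com\backups\dbBackup-test.bak';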


Step 4. Create a web page that executes the stored procedure
You can use PHP or ASP.NET for this as well, but for something this simple you can use classic ASP. That way there is no .dll to deal with and no application restart needed. Now create a new ASP page and call it backupdb.asp. The contents of the file are as follows. When you are done, upload this file to a folder in your content area.
This script will generate a filename based on the date. If a backup already exists for that date, it will increment a version counter until a fresh filename is found. This script will generate one file per execution; modify as needed if you want to increment a single file. Edit the location and connection string to work for you.

<%@ LANGUAGE="VBSCRIPT" CODEPAGE="65001"%>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Untitled Document</title>
</head>

<body>

<%
    dim thismonth, thisday, thisyear, location, filename, ver, extention, absolutepath

thismonth= datepart("m", now())
thisday=datepart("d", now())
thisyear=datepart("yyyy",now())

location="fs1-n03stor1wc1dfw8382492382489www.yoursite.combackups"
filename="dbBackup-" & thismonth & "-" & thisday & "-" & thisyear & "_"
ver=1
extention=".bak"

absolutepath=location & filename & ver & extention

    set fso = Server.CreateObject("Scripting.FileSystemObject")
 
while (fso.FileExists(absolutepath)=True)
ver=ver+1
absolutepath=location & filename & ver & extention
wend

    Set cn = Server.CreateObject("ADODB.Connection")
    cn.connectionString= "Provider=SQLNCLI;Server=mssql05-01.wc1;Database=123456_YourDatabase;Uid=123456_YourUsername; Pwd=Yourpassword;"
   cn.open

   Set cmd = Server.CreateObject("ADODB.Command")
   Set cmd.ActiveConnection = cn
   cmd.CommandText = "FullBackup"
   cmd.CommandType = 4 'adCmdStoredProc
 
   cmd.Parameters.Refresh
   cmd.Parameters(1) = absolutepath
 
   cmd.Execute

   cn.close
 
%>

   Execution complete:  Filename=<%= filename & ver & extention%>

</body>
</html>


Step 5.  Schedule a Cron job to call the web page.

Access your control panel and go to the Features tab of the site with the database. Choose HTTP as the language, enter the URL to the ASP script and your email address, and schedule the job for a daily run at an off-peak hour.

Our Special SQL Server 2014 Hosting Complete Features

What we think makes ASPHostPortal.com so compelling is how deeply integrated all the pieces are. We integrate and centralize everything--from the systems to the control panel software to the process of buying a domain name. For us, that means we can innovate literally everywhere. We've put the guys who develop the software and the admins who watch over the server right next to the 24-hour Fanatical Support team, so we all learn from each other.

Full Remote Access - We allow you full remote connectivity to your SQL Server 2014 Hosting database and do not restrict access in any way.
Easily transfer your existing SQL Server database - With our SQL Server hosting package, there's no need to rebuild your database from scratch should you wish to transfer an existing SQL Server database to us. If you already have a database hosted elsewhere, you can easily transfer the contents of your database using SQL Server Management Studio which is fully supported by our packages. SSMS provides you with an Import/Export wizard which you can use to upload your data and stored procedures with a couple of clicks.



About ASPHostPortal.com

We’re a company that works differently to most. Value is what we output and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic. It’s based on an even simpler principle. "Do good things, and good things will come to you".

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience – more than the journey. Life is a continual experience. We see the Internet as being an incredible amplifier to the experience of life for all of us. It can help humanity come together to explode in knowledge exploration and discussion. It is a continual enlightenment of new ideas, experiences, and passions.




Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
