Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

Windows 2012 Hosting - ASPHostPortal :: Things to Know About Deduplication in Windows Server 2012

August 15, 2013 08:14 by author Jervis

Talk to most administrators about deduplication and the usual response is: Why? Disk space is getting cheaper all the time, with I/O speeds ramping up along with it. The discussion often ends there with a shrug.

But the problem isn’t how much you’re storing or how fast you can get to it. The problem is whether the improvements in storage per gigabyte or I/O throughputs are being outpaced by the amount of data being stored in your organization. The more we can store, the more we do store. And while deduplication is not a magic bullet, it is one of many strategies that can be used to cut into data storage demands.

Microsoft added a deduplication subsystem to Windows Server 2012, which can deduplicate any volume managed by a given instance of Windows Server. Instead of relegating deduplication to a piece of hardware or a separate software layer, it's done in the OS at both the block and file level, meaning that many kinds of data (such as multiple instances of a virtual machine) can be successfully deduplicated with minimal overhead.

If you plan to implement Windows Server 2012 deduplication technology, be sure you understand these seven points:

1. Deduplication is not enabled by default

Don’t upgrade to Windows Server 2012 and expect to see space savings automatically appear. Deduplication is treated as a file-and-storage service feature, rather than a core OS component. To that end, you must enable it and manually configure it in Server Roles | File And Storage Services | File and iSCSI Services. Once enabled, it also needs to be configured on a volume-by-volume basis.
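If you prefer the command line, the same steps can be done with the PowerShell cmdlets that ship with the feature. A minimal sketch, assuming the data volume you want to deduplicate is D: (substitute your own drive letter):

# Install the Data Deduplication role service (part of File and Storage Services)
Add-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on the target data volume (D: is assumed here)
Import-Module Deduplication
Enable-DedupVolume -Volume D:

# Confirm that the volume is now enabled for deduplication
Get-DedupVolume -Volume D: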

2. Deduplication won’t burden the system

Microsoft put a fair amount of thought into setting up deduplication so it has a small system footprint and can run even on servers that have a heavy load. Here are three reasons why:

a. Content is only deduplicated once it is at least n days old, where n is 5 by default and user-configurable. This time delay keeps the deduplicator from trying to process content that is being actively and heavily used, or from processing files as they're being written to disk (which would be a major performance hit).

b. Deduplication can be constrained by directory or file type. If you want to exclude certain kinds of files or folders from deduplication, you can specify those as well.

c. The deduplication process is self-throttling and can be run at varying priority levels. You can set the actual deduplication process to run at low priority and it will pause itself if the system is under heavy load. You can also set a window of time for the deduplicator to run at full speed, during off-hours, for example.

This way, with a little admin oversight, deduplication can be put into place even on a busy server without impacting its performance.
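As a rough illustration of these knobs, here is a minimal PowerShell sketch, assuming the volume is D:; the excluded folder path, file-age value and schedule window are placeholders, not recommendations:

# Only deduplicate files that are at least 5 days old (the default)
Set-DedupVolume -Volume D: -MinimumFileAgeDays 5

# Exclude a folder holding hot, constantly changing data (path is hypothetical)
Set-DedupVolume -Volume D: -ExcludeFolder "D:\LiveDatabases"

# Let a full-speed optimization job run during off-hours on weeknights
New-DedupSchedule -Name "NightlyOptimization" -Type Optimization -Days Monday,Tuesday,Wednesday,Thursday,Friday -Start 23:00 -DurationHours 6 -Priority High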

3. Deduplicated volumes are ‘atomic units’

Being an ‘atomic unit’ means that all of the deduplication information about a given volume is kept on that volume itself, so it can be moved intact to another system that supports deduplication. If you move it to a system that doesn’t have deduplication, you’ll only be able to see the non-deduplicated files. The best rule is not to move a deduplicated volume unless it’s to another Windows Server 2012 machine.

4. Deduplication works with BranchCache

If you have a branch server also running deduplication, it shares data about deduped files with the central server and thus cuts down on the amount of data that needs to be sent between the two.

5. Backing up deduplicated volumes can be tricky

A block-based backup solution — e.g., a disk-image backup method — should work as-is and will preserve all deduplication data.

File-based backups will also work, but they won’t preserve deduplication data unless they’re dedupe-aware. They’ll back up everything in its original, discrete, undeduplicated form. What’s more, this means backup media should be large enough to hold the undeduplicated data as well.

The native Windows Server Backup tool is dedupe-aware; any third-party backup product for Windows Server 2012 should be checked to confirm that deduplication awareness is either present or planned for a future release.
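As an example of the native route, a one-off backup of a deduplicated data volume with the built-in Windows Server Backup command-line tool might look like the sketch below; the drive letters are placeholders, and wbadmin requires the Windows Server Backup feature to be installed:

# Block-level backup of deduplicated volume D: to a dedicated backup disk E:
# The optimized (deduplicated) data is preserved in the backup image.
wbadmin start backup -backupTarget:E: -include:D: -quiet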

6. More is better when it comes to cores and memory

Microsoft recommends devoting at least one CPU core and 350 MB of free memory to process one volume at a time, with around 100 GB of storage processed in an hour (without interruptions) or around 2 TB a day. The more parallelism you have to spare, the more volumes you can simultaneously process.
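Those figures make rough capacity planning easy. A back-of-the-envelope PowerShell sketch, using the ballpark rate above (the volume size is hypothetical):

# Estimate how long the initial optimization pass of a volume might take,
# assuming the rough throughput of ~100 GB per hour per volume cited above.
$volumeSizeGB   = 1500   # hypothetical amount of data on the volume
$gbPerHour      = 100    # ballpark per-volume processing rate
$estimatedHours = $volumeSizeGB / $gbPerHour
"Estimated first optimization pass: about $estimatedHours hours"

At that rate, 24 hours of uninterrupted processing works out to roughly 2.4 TB, which lines up with the "around 2 TB a day" figure.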

7. Deduplication mileage may vary

Microsoft has crunched its own numbers and found that the nature of the deployment affects the amount of space savings. Multiple OS instances on virtual hard disks (VHDs) exhibited a great deal of savings because of the amount of redundant material between them; user folders, less so.

In its rundown of what are good and bad candidates for deduping, Microsoft notes that live Exchange Server databases are actually poor candidates. This sounds counterintuitive; you’d think an Exchange mailbox database might have a lot of redundant data in it. But the constantly changing nature of data (messages being moved, deleted, created, etc.) offsets the gains in throughput and storage savings made by deduplication. However, an Exchange Server backup volume is a better candidate since it changes less often and can be deduplicated without visibly slowing things down.

How much you actually get from deduplication in your particular setting is the real test for whether to use it. Therefore, it’s best to start provisionally, perhaps on a staging server where you can set the “crawl rate” for deduplication as high as needed, see how much space savings you get with your data and then establish a schedule for performing deduplication on your own live servers.
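To see what you are actually getting on a staging volume, the deduplication cmdlets report the savings directly. A small sketch, assuming D: is the volume under test:

# Kick off an optimization pass manually and watch its progress
Start-DedupJob -Volume D: -Type Optimization
Get-DedupJob

# When the job finishes, check how much space was actually reclaimed
Get-DedupStatus -Volume D: | Format-List Volume, InPolicyFilesCount, OptimizedFilesCount, SavedSpace
Get-DedupVolume -Volume D: | Format-List Volume, Capacity, FreeSpace, SavedSpace, SavingsRate

Windows Server 2012 also ships a standalone evaluation tool, DDPEval.exe, that can estimate the potential savings for a volume before deduplication is ever enabled on it, which is a handy first step before committing a live server.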

 



Cloud Hosting - Cloud Computing Advantages

January 30, 2013 09:39 by author andy_yo

Cloud computing is a disruptive technology that is changing the way enterprises meet their IT hardware and software requirements. It is a mix of the latest ideas, technologies and delivery models, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS), that use the Internet to deliver services to the user. Users can access infrastructure (servers, software, data center space or network equipment); the computing platform and solution stack required to build an application, covering the full cycle of development, testing, deployment, hosting and maintenance; and most regular software applications. All of these are provided cheaply and efficiently over the Internet.

About ASPHostPortal.com

ASPHostPortal.com is Microsoft's No. 1 Recommended Windows and ASP.NET Spotlight Hosting Partner in the United States. Microsoft presents this award to ASPHostPortal.com for its ability to support the latest Microsoft and ASP.NET technology, such as WebMatrix, WebDeploy, Visual Studio 2012, ASP.NET 4.5, ASP.NET MVC 4.0, Silverlight 5 and Visual Studio LightSwitch. Click here for more information.

Some of the benefits of the Cloud are listed below:

Decreased Costs: The Cloud eliminates the need for each user to invest in stand-alone servers or software that are capital intensive but under-utilized most of the time. As technological innovations take place, these resources become obsolete and must be replaced with the latest in order to maintain operational efficiency, requiring more capital investment, and the cycle repeats. The Cloud eliminates the need for such ‘replacement’ capital expenditure.
Many users share a Cloud, leading to distributed costs and economies of scale as resources, including real estate, bandwidth and power, are centralized. The enterprise also saves on overheads such as management costs, data storage costs, software update costs and quality control, and is able to use Cloud services at economical rates.

Scalability and Speed: Enterprises no longer have to invest time in buying and setting up the hardware, software and other resources needed for a new application. They can quickly scale their usage of Cloud services up or down as market demand dictates, during hours of peak activity, while launching sales campaigns, and so on. Cloud services are also generally reliable, since many service providers maintain data centers in multiple locations to keep processing close to users.

Innovation: Enterprises can focus on innovation, as they do not have to own or manage resources. Cloud computing facilitates faster prototype development, testing and validation. Research and development projects, and activities where users have to collaborate on a task or project, benefit especially.

Convenience: Sharing of infrastructure and costs ensures low overheads and immediate availability of services. Payments are billed on the basis of actual consumption only, and the detailed billing provided by the service provider also helps keep costs in check.
Other than an Internet-connected device, no special equipment or specially trained staff is needed. One-off tasks can be performed on the Cloud. High-speed bandwidth ensures real-time response from infrastructure located at different sites.

Location Independence: Service providers can set up infrastructure in areas with lower overheads and pass on the benefit. They can set up multiple redundant sites to facilitate business continuity and disaster recovery. This helps the enterprise cut costs further.

Optimal Resource Utilization: Servers, storage and network resources are better utilized because the Cloud is shared by multiple users, cutting down on waste at a global level. Cloud computing is more environmentally friendly and energy efficient. Downtime is reduced, and resources are optimized across all the enterprises on the Cloud.

Flexibility: Users can opt out at will and thus gain a high level of operational flexibility. The services are covered by service-level agreements, and the service provider is required to pay a penalty if the agreed quality is not provided.

Device Independence: Applications provided through the Cloud can be accessed from any device – a computer, a smartphone, an iPad, etc. Any device that has access to the Internet can leverage the power of the Cloud.



ASPHostPortal.com Announces Newest Service: Windows Server 2012 Hosting

October 10, 2012 07:23 by author Jervis

ASPHostPortal, a premier web hosting company specializing in Windows and ASP.NET-based hosting, proudly announces the availability of a new Microsoft product, Windows Server 2012 hosting, to all new and existing customers. The newly released server operating system offers a number of features that benefit developers, resellers and businesses.

Windows Server 2012 offers new and improved features that enable a multi-tenant infrastructure in which each user's storage, networking and compute resources are fully isolated from other users. There are a number of key features that customers will find useful, including support for ASP.NET 4.5, Internet Information Services (IIS) 8.0, compatibility with Visual Studio 2012 and Visual Studio Express 2012, and support for ASP.NET MVC 4 and Entity Framework 5. Other key new features include dynamic IP restrictions to help prevent DoS attacks, support for WebSockets and node.js, CPU throttling to isolate each client's server usage, and Application Initialization to improve the user experience of first requests.

“We have always had a great appreciation for the products that Microsoft offers. With the launch of Windows Server 2012 hosting services, entrepreneurs and organizations will be able to build impressive websites and stay on top of the competition,” said Dean Thomas, Manager at ASPHostPortal. “Within Windows Server 2012 hosting packages, users will have the ability to use Hyper-V in assorted configurations, improved isolation and security, fortified access controls to files and processes, and the capability of managing servers as a group.”

ASPHostPortal is one of the Microsoft-recommended hosting partners providing a stable and reliable web hosting platform. The launch of Windows Server 2012 within its feature set will help keep ASPHostPortal among the front runners in the web hosting market. For more information about the new Windows Server 2012 hosting, please visit http://www.asphostportal.com.

About ASPHostPortal.com:

ASPHostPortal.com is a hosting company that specializes in Windows and ASP.NET-based hosting. Services include shared hosting, reseller hosting, and SharePoint hosting, with specialties in ASP.NET, SQL Server, and architecting highly scalable solutions. As a leading web hosting provider for small to mid-sized businesses, ASPHostPortal strives to offer the most technologically advanced hosting solutions available to all customers across the world. Security, reliability, and performance are at the core of its hosting operations, ensuring that each hosted site and application is highly secure and performs at an optimum level.

 



About ASPHostPortal.com

We’re a company that works differently from most. Value is what we output and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic; it’s based on an even simpler principle: "Do good things, and good things will come to you".

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience, more than the journey. Life is a continual experience. We see the Internet as an incredible amplifier of the experience of life for all of us. It can help humanity come together to explode in knowledge exploration and discussion. It is a continual enlightenment of new ideas, experiences, and passions.




Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
