Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

Windows 2012 Hosting :: Introduction to Dynamic Access Control in Windows Server 2012

September 2, 2013 12:09 by author Ben

Do you know what Dynamic Access Control is?
Microsoft Dynamic Access Control (DAC) is a data governance tool in Windows Server 2012 that lets administrators control access settings. DAC uses centralized policies to let administrators review who has access to individual files. Files can be classified manually or automatically.

Windows Server 2012 Dynamic Access Control automatically organizes information on file servers to meet business needs and regulatory requirements.

Using DAC's classification technology, an organization or company can apply "tags" or labels to files on the file server. Access to tagged files can then be controlled through centralized access policies, audits and reports can be generated for access events and access attempts, and RMS (Rights Management Services) can be used to encrypt Office documents so they stay safe when the data leaves the file server.

Windows Server 2012 provides a number of features that help the administrator in this regard:

  •     File owners can tag or label their own information directly, so the administrator does not need to do it for them.
  •     Central access policies can be applied to files (information) that have been tagged or labeled (see the sketch after this list).
  •     "Access denied remediation" is provided when a user cannot access information.
  •     Central audit policies can be configured to record information access (access logs) for later analysis and forensic needs.
  •     Particularly sensitive information can be protected with RMS automatically.
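
To give a feel for the second point, here is roughly how a central access rule and policy can be created with the Active Directory PowerShell module in Windows Server 2012. This is a minimal sketch: the rule and policy names, the resource condition, and the ACL string are hypothetical placeholders, and a real deployment would first define matching claim types and resource properties.

```powershell
# Minimal sketch (hypothetical names and ACL string): create a central
# access rule and publish it in a central access policy.
Import-Module ActiveDirectory

# A rule targeting files whose (pre-defined) Department resource property
# is "Finance"; the SDDL string below is an illustrative placeholder.
New-ADCentralAccessRule -Name "Finance Documents Rule" `
    -ResourceCondition '(@RESOURCE.Department_MS == "Finance")' `
    -CurrentAcl "O:SYG:SYD:AR(A;;FA;;;OW)(A;;FR;;;BA)"

# Group the rule into a central access policy, which is then deployed
# to file servers through Group Policy and applied on protected folders.
New-ADCentralAccessPolicy -Name "Finance Policy"
Add-ADCentralAccessPolicyMember -Identity "Finance Policy" `
    -Members "Finance Documents Rule"
```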


The labels or tags used to identify protected files make it possible to classify files logically. In Windows Server 2012, these labels can be applied in four ways:

  •     Based on location. When files are stored on a file server, they inherit the label of their parent folder.
  •     Manually. Users and administrators can label files by hand.
  •     Automatically. Files can be labeled automatically based on their content or other characteristics (see the sketch after this list).
  •     Through an application, using the labeling API, so that labels are maintained by the application.
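
For the automatic case, the File Server Resource Manager (FSRM) cmdlets can create classification rules. The sketch below assumes a resource property named Impact_MS has already been defined; the rule name, folder path, property value, and mechanism name are hypothetical and may vary by server configuration.

```powershell
# Minimal sketch (hypothetical names/paths): automatically label every
# file under a folder by setting a pre-defined classification property.
Import-Module FileServerResourceManager

New-FsrmClassificationRule -Name "Tag finance share" `
    -Property "Impact_MS" -PropertyValue "High" `
    -Namespace @("D:\Shares\Finance") `
    -ClassificationMechanism "Folder Classifier"

# Run classification on demand instead of waiting for the schedule.
Start-FsrmClassification
```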

 



Windows 2012 Hosting - ASPHostPortal :: Things to Know About Deduplication in Windows Server 2012

August 15, 2013 08:14 by author Jervis

Talk to most administrators about deduplication and the usual response is: Why? Disk space is getting cheaper all the time, with I/O speeds ramping up along with it. The discussion often ends there with a shrug.

But the problem isn’t how much you’re storing or how fast you can get to it. The problem is whether the improvements in storage per gigabyte or I/O throughputs are being outpaced by the amount of data being stored in your organization. The more we can store, the more we do store. And while deduplication is not a magic bullet, it is one of many strategies that can be used to cut into data storage demands.

Microsoft added a deduplication subsystem feature in Windows Server 2012, which provides a way to perform deduplication on all volumes managed by a given instance of Windows Server. Instead of relegating deduplication duty to a piece of hardware or a software layer, it’s done in the OS on both a block and file level — meaning that many kinds of data (such as multiple instances of a virtual machine) can be successfully deduplicated with minimal overhead.

If you plan to implement Windows Server 2012 deduplication technology, be sure you understand these seven points:

1. Deduplication is not enabled by default

Don’t upgrade to Windows Server 2012 and expect to see space savings automatically appear. Deduplication is treated as a file-and-storage service feature, rather than a core OS component. To that end, you must enable it and manually configure it in Server Roles | File And Storage Services | File and iSCSI Services. Once enabled, it also needs to be configured on a volume-by-volume basis.
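
From PowerShell, enabling it looks roughly like this; a minimal sketch, assuming a data volume E: (the drive letter is a placeholder):

```powershell
# Install the deduplication role service (once per server).
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on a specific volume (hypothetical drive letter).
Import-Module Deduplication
Enable-DedupVolume -Volume "E:"

# Confirm the volume is enabled and inspect its current settings.
Get-DedupVolume -Volume "E:"
```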

2. Deduplication won’t burden the system

Microsoft put a fair amount of thought into setting up deduplication so it has a small system footprint and can run even on servers that have a heavy load. Here are three reasons why:

a. Content is only deduplicated after n days, with n being 5 by default but user-configurable. This time delay keeps the deduplicator from processing content that is currently and aggressively being used, or from processing files as they are being written to disk (which would constitute a major performance hit).

b. Deduplication can be constrained by directory or file type. If you want to exclude certain kinds of files or folders from deduplication, you can specify those as well.

c. The deduplication process is self-throttling and can be run at varying priority levels. You can set the actual deduplication process to run at low priority and it will pause itself if the system is under heavy load. You can also set a window of time for the deduplicator to run at full speed, during off-hours, for example.

This way, with a little admin oversight, deduplication can be put into place even on a busy server without impacting its performance. The sketch below shows how these three knobs are set.
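
All three behaviors are configurable through the Deduplication module. A minimal sketch, assuming volume E: and hypothetical folder and schedule names:

```powershell
# (a) Only process files older than 3 days (the default is 5).
Set-DedupVolume -Volume "E:" -MinimumFileAgeDays 3

# (b) Exclude folders that should never be deduplicated
#     (hypothetical path).
Set-DedupVolume -Volume "E:" -ExcludeFolder @("E:\Scratch")

# (c) Schedule a full-speed optimization job during off-hours; outside
#     this window the background job runs at low priority and pauses
#     itself under load.
New-DedupSchedule -Name "NightlyOptimization" -Type Optimization `
    -Start 23:00 -DurationHours 6 -Days Monday,Wednesday,Friday
```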

3. Deduplicated volumes are ‘atomic units’

‘Atomic unit’ means that all of the deduplication information about a given volume is kept on that volume, so the volume can be moved intact to another system that supports deduplication. If you move it to a system that doesn’t have deduplication, you’ll only be able to see the nondeduplicated files. The best rule is not to move a deduplicated volume unless it’s to another Windows Server 2012 machine.

4. Deduplication works with BranchCache

If you have a branch server also running deduplication, it shares data about deduped files with the central server and thus cuts down on the amount of data that needs to be sent between the two.

5. Backing up deduplicated volumes can be tricky

A block-based backup solution — e.g., a disk-image backup method — should work as-is and will preserve all deduplication data.

File-based backups will also work, but they won’t preserve deduplication data unless they’re dedupe-aware. They’ll back up everything in its original, discrete, undeduplicated form. What’s more, this means backup media should be large enough to hold the undeduplicated data as well.

The native Windows Server Backup solution is dedupe-aware, although any third-party backup products for Windows Server 2012 should be checked to see if deduplication awareness is either present or being added in a future revision.
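
As an illustration of the block-based approach, the built-in Windows Server Backup can capture a full volume image from the command line; a minimal sketch with placeholder drive letters:

```powershell
# Block-level (volume image) backup of deduplicated volume E: to disk F:,
# preserving the deduplication chunk store as-is.
# Drive letters are placeholders.
wbadmin start backup -backupTarget:F: -include:E: -quiet
```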

6. More is better when it comes to cores and memory

Microsoft recommends devoting at least one CPU core and 350 MB of free memory to process one volume at a time, with around 100 GB of storage processed in an hour (without interruptions) or around 2 TB a day. The more parallelism you have to spare, the more volumes you can simultaneously process.
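
Those figures give a quick way to estimate how long an initial optimization pass will take. A back-of-the-envelope sketch, with an illustrative volume size:

```powershell
# Rough sizing estimate from Microsoft's ~100 GB/hour, ~2 TB/day guidance.
# Numbers are illustrative; real throughput depends on your hardware.
$volumeSizeGB = 4096                 # hypothetical 4 TB volume
$hours = $volumeSizeGB / 100         # ~100 GB/hour -> ~41 hours of work
$days  = $volumeSizeGB / 2048        # ~2 TB/day    -> ~2 days wall-clock
"First pass: roughly $hours hours, spread over about $days days"
```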

7. Deduplication mileage may vary

Microsoft has crunched its own numbers and found that the nature of the deployment affected the amount of space savings. Multiple OS instances on virtual hard disks (VHDs) exhibited a great deal of savings because of the amount of redundant material between them; user folders, less so.

In its rundown of good and bad candidates for deduping, Microsoft notes that live Exchange Server databases are actually poor candidates. This sounds counterintuitive; you’d think an Exchange mailbox database would have a lot of redundant data in it. But the constantly changing nature of the data (messages being moved, deleted, created, etc.) offsets the throughput and storage gains made by deduplication. However, an Exchange Server backup volume is a better candidate, since it changes less often and can be deduplicated without visibly slowing things down.

How much you actually get from deduplication in your particular setting is the real test for whether to use it. Therefore, it’s best to start provisionally, perhaps on a staging server where you can set the “crawl rate” for deduplication as high as needed, see how much space savings you get with your data and then establish a schedule for performing deduplication on your own live servers.
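
Two tools help with that measurement: before enabling anything, DDPEval.exe (which ships with Windows Server 2012) can estimate the savings for a given path, and once deduplication is running, Get-DedupVolume reports the savings actually achieved. The paths and drive letters below are placeholders:

```powershell
# Estimate potential savings on a share before enabling deduplication.
ddpeval.exe E:\Shares

# After deduplication has been running, check the realized savings.
Get-DedupVolume -Volume "E:" | Format-List Volume, SavedSpace, SavingsRate
```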

 



ASPHostPortal.com Announces Newest Service Windows Server 2012 Hosting

October 10, 2012 07:23 by author Jervis

ASPHostPortal, a premier web hosting company specializing in Windows and ASP.NET-based hosting, proudly announces Windows Server 2012 hosting for all new and existing customers. The newly released server operating system offers a number of features that benefit developers, resellers, and businesses.

Windows Server 2012 offers new and improved features that enable a multi-tenant infrastructure in which storage, networking, and compute resources are completely isolated from other users. Customers will find a number of key features useful, including support for ASP.NET 4.5, Internet Information Services (IIS) 8.0, compatibility with Visual Studio 2012 and Visual Studio Express 2012, and support for ASP.NET MVC 4 and Entity Framework 5. Other key new features include dynamic IP restrictions to help prevent DoS attacks, support for WebSockets and Node.js, CPU throttling to isolate each client's server usage, and Application Initialization to improve the user experience of first requests.

“We have always had a great appreciation for the products that Microsoft offers. With the launch of Windows Server 2012 hosting services, entrepreneurs and organizations will be able to build impressive websites and stay on top of the competition,” said Dean Thomas, Manager at ASPHostPortal. “With Windows Server 2012 hosting packages, users will have the ability to use Hyper-V in assorted configurations, improved isolation and security, fortified access controls for files and processes, and the capability of managing servers as a group.”

ASPHostPortal is one of the Microsoft-recommended hosting partners and provides a stable and reliable web hosting platform. The launch of Windows Server 2012 within its feature set will help keep ASPHostPortal among the front runners in the web hosting market. For more information about the new Windows Server 2012 hosting, please visit http://www.asphostportal.com.

About ASPHostPortal.com:

ASPHostPortal.com is a hosting company offering best-in-class support for Windows and ASP.NET-based hosting. Services include shared hosting, reseller hosting, and SharePoint hosting, with specialties in ASP.NET, SQL Server, and architecting highly scalable solutions. As a leading small to mid-sized business web hosting provider, ASPHostPortal strives to offer the most technologically advanced hosting solutions available to customers across the world. Security, reliability, and performance are at the core of its hosting operations, ensuring that each hosted site and application is highly secure and performs at an optimum level.

 



About ASPHostPortal.com

We’re a company that works differently from most. Value is what we produce and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic. It’s based on an even simpler principle: "Do good things, and good things will come to you."

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience, more than the journey. Life is a continual experience. We see the Internet as an incredible amplifier of the experience of life for all of us. It can help humanity come together in an explosion of knowledge exploration and discussion. It is a continual enlightenment of new ideas, experiences, and passions.




Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
