Windows 2012 Hosting - MVC 6 and SQL 2014 BLOG

Tutorial and Articles about Windows Hosting, SQL Hosting, MVC Hosting, and Silverlight Hosting

Windows 2012 Hosting :: Hyper-V 3.0 Network Virtualization on Windows 2012

clock August 28, 2013 10:21 by author Mike

Windows Server 2012 introduces a slew of new technologies. These technologies enable Windows Server systems and virtual environments to meet all manner of new requirements and scenarios, including private and public cloud implementations. Often, this type of scenario involves a single infrastructure that's shared by different business units or even different organizations.

In this article, I want to describe network virtualization. Other great capabilities include a new site-to-site VPN solution; huge enhancements to the Server Message Block (SMB) protocol, enabling VMs to run from a Windows Server 2012 file share; native NIC teaming; and consistent device naming. But I want to focus on the major network technologies that most affect virtualization.

Virtualization has always striven to abstract one resource layer from another, giving improved functionality and portability. But networking hasn't embraced this goal, and VMs are tied to the networking configuration on the host that runs them. Microsoft System Center Virtual Machine Manager (VMM) 2012 tries to link VMs to physical networks through its logical networks feature, which lets you create logical networks such as Development, Production, and Backup. You can then create IP subnets and virtual LANs (VLANs) for each physical location that has a connection to a logical network. This capability lets you create VMs that automatically connect to the Production network, for example; VMM works out the actual Hyper-V switch that should be used and the IP scheme and VLAN tag, based on the actual location to which the VM is deployed.

This feature is great. But it still doesn't help in scenarios in which I might be hosting multiple tenants that require their own IP schemes, or even one tenant that requires VMs to move between different locations or between private and public clouds, without changing IP addresses or policies that relate to the network. Typically, public cloud providers require clients to use the hosted IP scheme, which is an issue for flexible migration between on-premises and off-premises hosting.

Both these scenarios require the network to be virtualized, and the virtual network must believe that it wholly owns the network fabric, in the same way that a VM believes it owns the hardware on which it runs. VMs don't see other VMs, and virtual networks shouldn't see or care about other virtual networks on the same physical fabric, even when they have overlapping IP schemes. Network isolation is a crucial part of network virtualization, especially when you consider hosted scenarios. If I'm hosting Pepsi and Coca-Cola on the same physical infrastructure, I need to be very sure that they can't see each other's virtual networks. They need complete network isolation.

This virtual network capability is enabled through the use of two IP addresses for each VM, plus a virtual subnet identifier that indicates the virtual network to which a particular VM belongs. The first IP address is the standard address that's configured within the VM and is referred to as the customer address (using IEEE terms). The second IP address is the address over which the VM's traffic actually travels on the physical network and is known as the provider address.

In the example that Figure 1 shows, we have one physical fabric. Running on that fabric are two separate organizations: red and blue. Each organization has its own IP scheme, which can overlap, and the virtual networks can span multiple physical locations. Each VM that is part of the virtual red or blue network has its own customer address. A separate provider address is used to send the actual IP traffic over the physical fabric.

Figure 1: Virtual networking example

You can see that the physical fabric has the network and compute resources and that multiple VMs run across the hosts and sites. The color of each VM corresponds to its virtual network (red or blue). Even though the VMs are distributed across hosts and locations, the VMs in each virtual network are completely isolated from the other virtual networks, each with its own IP scheme.

Two solutions, IP rewrite and Generic Routing Encapsulation (GRE), enable network virtualization in Windows Server 2012. Both solutions allow completely separate virtual networks with their own IP schemes (which can overlap) to run over one shared fabric.

IP rewrite. The first option is IP rewrite, which does exactly what the name suggests. Each VM has two IP addresses: a customer address, which is configured within the VM, and a provider address, which is used for the actual packet transmission over the network. The Hyper-V switch inspects the traffic that the VM sends out, uses the virtual subnet ID to identify the correct virtual network, and rewrites the source and target IP addresses from the customer addresses to the corresponding provider addresses. This approach requires many IP addresses from the provider address pool because every VM needs its own provider address. The good news is that because the IP packet isn't modified (apart from the addresses), hardware offloads such as virtual machine queue (VMQ), checksum, and receive-side scaling (RSS) continue to function. IP rewrite adds very little overhead to the network process and gives very high performance.

Figure 2 shows the IP rewrite process, along with the mapping table that the Hyper-V host maintains. The Hyper-V host maintains the mapping of customer-to-provider addresses, each of which is unique for each VM. The source and destination IP addresses of the original packet are changed as the packet is sent via the Hyper-V switch. The arrows in the figure show the flow of IP traffic.

Figure 2: IP rewrite process

GRE. The second option is GRE, an Internet Engineering Task Force (IETF) standard. GRE wraps the originating packet, which uses the customer addresses, inside a packet that can be routed on the physical network by using the provider address and that includes the actual virtual subnet ID. Because the virtual subnet ID is included in the wrapper packet, VMs don't require their own provider addresses. The receiving host can identify the targeted VM based on the target customer address within the original packet and the virtual subnet ID in the wrapper packet. All the Hyper-V host running the originating VM needs to know is which Hyper-V host runs the target VM; it can then send the packet over the network.

The use of a shared provider address means that far fewer IP addresses from the provider IP pools are needed. This is good news for IP management and the network infrastructure. However, there is a downside, at least as of this writing. Because the original packet is wrapped inside the GRE packet, any kind of NIC offloading will break. The offloads won't understand the new packet format. The good news is that many major hardware manufacturers are in the process of adding support for GRE to all their network equipment, enabling offloading even when GRE is used.

Figure 3 shows the GRE process. The Hyper-V host still maintains the mapping of customer-to-provider addresses, but this time the provider address is per Hyper-V host virtual switch. The original packet is unchanged; instead, it is wrapped in a GRE packet as it passes through the Hyper-V switch, and the wrapper carries the correct source and destination provider addresses in addition to the virtual subnet ID.


Figure 3: GRE 

In both technologies, virtualization policies are used between all the Hyper-V hosts that participate in a specific virtual network. These policies enable the routing of the customer address across the physical fabric and track the customer-to-provider address mapping. The virtualization policies can also define which virtual networks are allowed to communicate with other virtual networks. The virtualization policies can be configured by using Windows PowerShell, which is a common direction for Windows Server 2012. This makes sense: When you consider massive scale and automation, the current GUI really isn't sufficient. The challenge when using native PowerShell commands is the synchronous orchestration of the virtual-network configuration across all participating Hyper-V hosts.
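To give a flavor of those commands, here is a hedged sketch using the network virtualization cmdlets that ship with Windows Server 2012. All addresses, the subnet ID, the routing domain GUID, and the interface index are made-up example values, and equivalent records must be kept in sync on every participating host:

# Assign a provider address to the physical NIC that carries tenant traffic
New-NetVirtualizationProviderAddress -InterfaceIndex 12 -ProviderAddress 192.168.50.10 -PrefixLength 24

# Map a VM's customer address to its provider address within virtual subnet 5001;
# "TranslationMethodEncap" selects GRE, "TranslationMethodNat" selects IP rewrite
New-NetVirtualizationLookupRecord -CustomerAddress 10.0.0.5 -ProviderAddress 192.168.50.10 -VirtualSubnetID 5001 -MACAddress "00155D010203" -Rule "TranslationMethodEncap"

# Describe how the customer subnet routes inside the virtual network
New-NetVirtualizationCustomerRoute -RoutingDomainID "{12345678-0000-0000-0000-000000000000}" -VirtualSubnetID 5001 -DestinationPrefix "10.0.0.0/24" -NextHop "0.0.0.0"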

Both options sound great, but which one should you use? GRE should be the network virtualization technology of choice because it's faster than IP rewrite, provided the network hardware supports GRE; without hardware support, offloads would have to be handled in software, which would be very slow. Also, because of the reduced provider address requirements, GRE places fewer burdens on the network infrastructure. However, until your networking equipment supports GRE, you should use IP rewrite, which requires no changes on the network infrastructure equipment.

 



Windows 2012 Hosting - ASPHostPortal :: Things to Know About Deduplication in Windows Server 2012

clock August 15, 2013 08:14 by author Jervis

Talk to most administrators about deduplication and the usual response is: Why? Disk space is getting cheaper all the time, with I/O speeds ramping up along with it. The discussion often ends there with a shrug.

But the problem isn’t how much you’re storing or how fast you can get to it. The problem is whether the improvements in storage per gigabyte or I/O throughputs are being outpaced by the amount of data being stored in your organization. The more we can store, the more we do store. And while deduplication is not a magic bullet, it is one of many strategies that can be used to cut into data storage demands.

Microsoft added a deduplication subsystem feature in Windows Server 2012, which provides a way to perform deduplication on all volumes managed by a given instance of Windows Server. Instead of relegating deduplication duty to a piece of hardware or a software layer, it’s done in the OS on both a block and file level — meaning that many kinds of data (such as multiple instances of a virtual machine) can be successfully deduplicated with minimal overhead.

If you plan to implement Windows Server 2012 deduplication technology, be sure you understand these seven points:

1. Deduplication is not enabled by default

Don’t upgrade to Windows Server 2012 and expect to see space savings automatically appear. Deduplication is treated as a file-and-storage service feature, rather than a core OS component. To that end, you must enable it and manually configure it in Server Roles | File And Storage Services | File and iSCSI Services | Data Deduplication. Once enabled, it also needs to be configured on a volume-by-volume basis.
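If you prefer scripting, the same enablement can be done from PowerShell. A minimal sketch, assuming drive E: is the target volume:

# Install the Data Deduplication role service
Import-Module ServerManager
Add-WindowsFeature FS-Data-Deduplication

# Turn deduplication on for a specific volume (it is off by default)
Import-Module Deduplication
Enable-DedupVolume -Volume E: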

2. Deduplication won’t burden the system

Microsoft put a fair amount of thought into setting up deduplication so it has a small system footprint and can run even on servers that have a heavy load. Here are three reasons why:

a. Content is only deduplicated after n number of days, with n being 5 by default (user-configurable). This time delay keeps the deduplicator from processing content that is in active, aggressive use, and from processing files as they’re being written to disk (which would constitute a major performance hit).

b. Deduplication can be constrained by directory or file type. If you want to exclude certain kinds of files or folders from deduplication, you can specify those as well.

c. The deduplication process is self-throttling and can be run at varying priority levels. You can set the actual deduplication process to run at low priority and it will pause itself if the system is under heavy load. You can also set a window of time for the deduplicator to run at full speed, during off-hours, for example.

This way, with a little admin oversight, deduplication can be put into place on even a busy server and not impact its performance.
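The knobs described in points a through c can also be set from PowerShell. A hedged sketch with illustrative values, again assuming volume E::

# a. Raise the minimum file age from its 5-day default
Set-DedupVolume -Volume E: -MinimumFileAgeDays 10

# b. Exclude folders and file types from processing
Set-DedupVolume -Volume E: -ExcludeFolder "E:\Temp" -ExcludeFileType "tmp","log"

# c. Kick off an optimization job by hand; it self-throttles under load
Start-DedupJob -Volume E: -Type Optimization
Get-DedupStatus -Volume E: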

3. Deduplicated volumes are ‘atomic units’

‘Atomic units’ means that all of the deduplication information about a given volume is kept on that volume, so the volume can be moved intact to another system that supports deduplication. If you move it to a system that doesn’t have deduplication, you’ll only be able to see the nondeduplicated files. The best rule is not to move a deduplicated volume unless it’s to another Windows Server 2012 machine.

4. Deduplication works with BranchCache

If you have a branch server also running deduplication, it shares data about deduped files with the central server and thus cuts down on the amount of data needed to be sent between the two.

5. Backing up deduplicated volumes can be tricky

A block-based backup solution — e.g., a disk-image backup method — should work as-is and will preserve all deduplication data.

File-based backups will also work, but they won’t preserve deduplication data unless they’re dedupe-aware. They’ll back up everything in its original, discrete, undeduplicated form. What’s more, this means backup media should be large enough to hold the undeduplicated data as well.

The native Windows Server Backup solution is dedupe-aware, although any third-party backup products for Windows Server 2012 should be checked to see if deduplication awareness is either present or being added in a future revision.

6. More is better when it comes to cores and memory

Microsoft recommends devoting at least one CPU core and 350 MB of free memory to process one volume at a time, with around 100 GB of storage processed in an hour (without interruptions) or around 2 TB a day. The more parallelism you have to spare, the more volumes you can simultaneously process.

7. Deduplication mileage may vary

Microsoft has crunched its own numbers and found that the nature of the deployment affected the amount of space savings. Multiple OS instances on virtual hard disks (VHDs) exhibited a great deal of savings because of the amount of redundant material between them; user folders, less so.

In its rundown of what are good and bad candidates for deduping, Microsoft notes that live Exchange Server databases are actually poor candidates. This sounds counterintuitive; you’d think an Exchange mailbox database might have a lot of redundant data in it. But the constantly changing nature of data (messages being moved, deleted, created, etc.) offsets the gains in throughput and storage savings made by deduplication. However, an Exchange Server backup volume is a better candidate since it changes less often and can be deduplicated without visibly slowing things down.

How much you actually get from deduplication in your particular setting is the real test for whether to use it. Therefore, it’s best to start provisionally, perhaps on a staging server where you can set the “crawl rate” for deduplication as high as needed, see how much space savings you get with your data and then establish a schedule for performing deduplication on your own live servers.
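One low-risk way to run such a provisional test is DDPEval.exe, an evaluation tool that ships with Windows Server 2012; it estimates the savings for a folder or volume without changing any data. For example (the path is illustrative):

C:\> ddpeval.exe E:\Shares\UserDocs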

 



Reporting Service (SSRS) 2008 Hosting :: SSRS Report Execution and Performance Enhancements

clock July 22, 2013 06:17 by author Mike

Reporting Service (SSRS) 2008 R2 allows us to execute reports in 3 modes:

  1. On Demand
  2. From Cache
  3. From Snapshots

 

On Demand

  • This is the normal approach that we follow: hitting a report server URL. Each time the report is run, data is returned from the database server and rendered into the report.
  • This approach ensures that our report is up to date and fresh.
  • The downside of this approach is that if n users open the report in their browsers, the report's queries are executed n times.
  • Thus this approach can at times slow down the server.

Cache

  • One performance enhancement technique is to cache a report when it is initially run.
  • This means that if another user requests the report, it is served from the cache.
  • This avoids querying the database server for each report rendering.
  • To make sure that users do not receive overly stale data, we can set a time at which the cache is invalidated.
  • This is a good performance enhancement technique for slow-running reports.

Snapshots

  • Report snapshots are created on a particular schedule for certain parameter values.
  • Please note that parameters cannot be changed on snapshot reports.
  • SSRS 2008 R2 allows you to schedule snapshot creation times.
  • Users can directly render a report from a snapshot. However, please note that not all reports can have snapshots, especially ones that prompt users for credentials.


Windows 2008 Hosting Tips :: Installing IIS 7.5 on Windows Server 2008/2008 R2

clock July 1, 2013 09:05 by author Mike

IIS (Internet Information Services) 7.5 is a web server application and a set of feature extension modules created by Microsoft for use with Microsoft Windows. It was released with Windows Server 2008 R2 and Windows 7. The installation procedure for IIS 7.5 differs slightly depending on which operating system you are using, so make sure you follow the correct method below for your operating system.

Follow these steps to install it:

  • Navigate to Start\Control Panel\Administrative Tools and open Server Manager.
  • In the tree menu on the left, select Roles.
  • In the main window, find the sub-section Roles Summary (it should be at the top) and click Add Roles.
  • On the pop-up window, read the security warnings and then select Next.
  • Select Web Server (IIS) from the list of check box options.
  • After reading the introduction to IIS, click Next to continue
  • From the check boxes, find the Application Development section and select CGI. This will allow us to install PHP later without any modifications.

 

  • Click Next to continue.
  • Click Install to begin the installation.
  • After installation has finished, click Close.
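If you prefer scripting, the Web Server role and the CGI role service can also be installed from an elevated PowerShell prompt on Windows Server 2008 R2 using the ServerManager module. A minimal sketch:

# Import the Server Manager cmdlets and install IIS with CGI support
Import-Module ServerManager
Add-WindowsFeature Web-Server, Web-CGI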

To check the installation, follow these steps:
- Navigate to Start\Control Panel\Administrative Tools and open Internet Information Services (IIS) Manager.

 

If it's installed properly, you'll be presented with the IIS Manager window.

That's it, IIS 7.5 is now installed and ready to be put into service.



WebMatrix Hosting – ASPHostPortal.com :: Quick Start screen

clock May 8, 2013 08:05 by author Ben

Microsoft has just introduced the new version of WebMatrix. As of this writing, it's still in beta.

If you were an ASP.NET developer back in the 1.1 days, you might have tried the original WebMatrix, which was free, as it is today. At that time there was no Express edition of Visual Studio, and WebMatrix was the free tool for developing ASP.NET websites if you didn't want to buy Visual Studio.

What is this?
WebMatrix is everything you need to build Web sites using Windows. It includes IIS Developer Express (a development Web server), ASP.NET (a Web framework), and SQL Server Compact (an embedded database). It streamlines Web site development and makes it easy to start Web sites from popular open-source apps. The skills and code you develop with WebMatrix transition seamlessly to Visual Studio and SQL Server.

Why Use It?
You will use the same powerful web server, database engine, and web framework that will run your website on the Internet, which makes the transition from development to production seamless. Beyond ensuring everything just works, WebMatrix includes new features that make web development easier.

Who’s it for?
WebMatrix is for developers, students, or just about anyone who just wants a small and simple way to build Web sites. Start coding, testing, and deploying your own Web sites without having to worry about configuring your own Web server, managing databases, or learning a lot of concepts. WebMatrix makes Web site development easy.

Code Without Boundaries
WebMatrix provides an easy way to get started with Web development. With an integrated code editor and a database editor, Web site and server management, search optimization, FTP publishing, and more, WebMatrix provides a fresh, new Web site development experience that seamlessly bridges all the key components you need in order to create, run, and deploy a Web site.

As your needs grow, the code and skills you develop can seamlessly transition to Visual Studio – Microsoft’s premier development suite.

WebMatrix – Quick Start screen
On first-time run of Microsoft WebMatrix, below screen will be the default screen on WebMatrix.

If you don’t like it, you can disable it by checking “Do not show this screen on start-up”. WebMatrix will then use My Sites as the default screen and open the last site you worked with.

If you later want the Quick Start screen back, close the open site; the Quick Start screen will reappear, and you can remove the check mark if you want.

Okay, let’s try the menu items one by one.

My Sites
It will open a dialog listing all the sites you have ever worked with. Select a site and click the OK button to open it, or just double-click the site.

I found that there is no way to remove a site from the list in this beta version. Maybe it will be added in a newer version… let’s hope, or better yet, suggest it to them.

Site From Web Gallery
This option opens a dialog to select a website or application from the Web Gallery. This is the same as installing a community website or application using Web PI, but with Microsoft WebMatrix the website or application will not be installed in IIS; rather, it will be stored in the %user%\Documents\My Web Sites folder and will run in IIS Developer Express by default when you’re working with the site in WebMatrix. (If you want, you can also add the website or application to your computer’s IIS manually.)

Site From Template
This is where all the fun starts. You can create your own site from the online templates listed there.

When you create a new site from a template, you will find that there are new extensions, .cshtml and .vbhtml. Yep, they are new extensions introduced by Microsoft for ASP.NET Web Pages, and they support the Razor syntax.
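To give a quick taste of the Razor syntax, here is a minimal .cshtml page (a made-up example, not one of the bundled templates):

@{
    var greeting = "Hello from Razor";
}
<!DOCTYPE html>
<html>
    <body>
        <h1>@greeting</h1>
        <p>The server time is @DateTime.Now</p>
    </body>
</html>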



$50.00 Cheap and Reliable Windows Cloud Server with ASPHostPortal.com

clock May 6, 2013 11:43 by author Jervis

ASPHostPortal.com is Microsoft No #1 Recommended Windows and ASP.NET Spotlight Hosting Partner in United States. Microsoft presents this award to ASPHostPortal.com for ability to support the latest Microsoft and ASP.NET technology, such as: WebMatrix, WebDeploy, Visual Studio 2012, ASP.NET 4.5, ASP.NET MVC 4.0, Silverlight 5 and Visual Studio Lightswitch. Click here for details. Now, ASPHostPortal.com proudly announces the most affordable, reliable and cheap Windows Cloud Server on our hosting environment.

 

Microsoft built hundreds of enhancements and new features into Windows Server 2012. Below are several points highlighted by Microsoft:

  1. Powerful. It delivers the power of many servers, with the simplicity of one. Windows Server 2012 offers you excellent economics by providing an integrated, highly available, and easy-to-manage multiple-server platform.
  2. Continuous availability. New and improved features offer cost-effective IT service with high levels of uptime. Servers are designed to endure failures without disrupting services to users.
  3. Open. Windows Server 2012 enables business-critical applications and enhances support for open standards, open source applications, and various development languages.
  4. Flexible. Windows Server 2012 enables symmetrical or hybrid applications across the data center and the cloud. You can build applications that use distributed and temporally decoupled components.

From just $50.00/month, you can get a dedicated cloud server with the following specifications:

- Windows 2008 R2 / Windows 2012 Standard Edition OS
- 1 Core Processor
- 1 GB RAM
- 100 GB SAN Storage
- 1000 GB Bandwidth
- 100 Mbps Connection Speed
- Full 24/7 RDP Access
- Full 24/7 Windows Firewall Protection
- Choice of United States, European (Amsterdam), and Asia (Singapore) Data Center

“As a leader in Windows Hosting, and having worked closely with Microsoft to reach this point, we’re moving forward with expectant anticipation for serving our loyal clients with Windows Cloud Server Solutions,” said Dean Thomas, Manager at ASPHostPortal.com.

For more details, please visit http://www.asphostportal.com/Windows-Cloud-Server-Hosting-Plans.aspx.

About ASPHostPortal.com:


ASPHostPortal.com is a hosting company that specializes in Windows and ASP.NET-based hosting. Services include shared hosting, reseller hosting, and SharePoint hosting, with specialties in ASP.NET, SQL Server, and architecting highly scalable solutions. As a leading small to mid-sized business web hosting provider, ASPHostPortal strives to offer the most technologically advanced hosting solutions available to all customers across the world. Security, reliability, and performance are at the core of its hosting operations, ensuring that each site and/or application hosted is highly secure and performs at an optimum level.



ASP.NET MVC 4 Hosting - ASPHostPortal :: How to Customize oAuth Login UI in ASP.NET 4.5 and MVC 4

clock May 3, 2013 08:32 by author Jervis

In this quick post, we will see how to customize the oAuth login UI in ASP.NET 4.5 and MVC 4.

One common requirement with oAuth login is displaying each provider’s logo or image.

In ASP.NET 4.5 and MVC 4, we register oAuth providers in the App_Start/AuthConfig.cs file, and here we can also pass additional data to each provider. We can use this extra data dictionary to pass the provider’s image URL to the view and, based on it, display an image for each provider.
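Here is a representative sketch of that registration in App_Start/AuthConfig.cs, assuming a Facebook client; the appId, appSecret, and icon path are placeholders:

using System.Collections.Generic;
using Microsoft.Web.WebPages.OAuth;

public static class AuthConfig
{
    public static void RegisterAuth()
    {
        // Extra data travels with the registered client and is available to views
        var facebookExtraData = new Dictionary<string, object>
        {
            { "Icon", "~/Images/facebook.png" }   // example icon URL
        };

        OAuthWebSecurity.RegisterFacebookClient(
            appId: "your-app-id",
            appSecret: "your-app-secret",
            displayName: "Facebook",
            extraData: facebookExtraData);
    }
}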

As the snippet above shows, we pass the icon URL in the extra data, so we can now access this icon URL from the view. Open Views/Account/_ExternalLoginsListPartial.cshtml and replace the classic button markup with the code snippet below.
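A sketch of the replacement markup, reading the icon URL back from the client's ExtraData dictionary (the markup details are illustrative):

@foreach (AuthenticationClientData p in Model)
{
    <button type="submit" name="provider" value="@p.AuthenticationClient.ProviderName"
            title="Log in using your @p.DisplayName account">
        <img src="@Url.Content((string)p.ExtraData["Icon"])" alt="@p.DisplayName" />
    </button>
}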

Now run the application by pressing F5 and, wow, we have a brand-new login UI for the oAuth providers.

ASP.NET MVC 4 Hosting starts from $3.00/month. Check it out for more information!

 



ASP.NET MVC 4 Hosting - ASPHostPortal :: Multiple Views and DisplayMode Providers in ASP.NET MVC 4

clock April 8, 2013 12:35 by author Jervis

All in all, the biggest difference between ASP.NET MVC and ASP.NET Web Forms is the neat separation that exists in ASP.NET MVC between the action that processes a request and the generation of the subsequent response for the browser.

The request lifecycle in Web Forms was a continuous flow. First, a Page object was created from default settings hard-coded in the ASPX file and then initialized to the last known good state read from the viewstate field. User code then had a chance to further update the state of the Page object before the postback event was handled and the state of the page to render back to the user was prepared.

All this happened in a single procedure: There was little chance for developers to serve different views in front of the same request. On the other hand, Web Forms is built around the concept of a “page”. If you request a page, you’re going to get “that” page. So if you request default.aspx from a site intended for desktop use, why should you expect to receive a mobile-optimized page just because you’re making the request from a mobile device? If you want a mobile page, you just set up a new mobile site and make it reachable through a different URL. At that point, you have a distinct “page” to invoke and it all works as expected.

Web Forms at some point was also extended with convention-based tricks to automatically serve different master pages to different browsers and also to make server controls capable of returning different values for different browsers. Nevertheless, Web Forms serves the vision of the web world that was mainstream more than a decade ago. Unless you have serious backward compatibility reasons, you should definitely consider making the step forward to ASP.NET MVC; and use ASP.NET MVC for any new development.

Anyway, this article is NOT about the endless debate over the relative merits of Web Forms and MVC; there’s really nothing to discuss there. This article is about new features introduced in ASP.NET MVC 4 to make it really easy and effective to serve different views in front of the same request.

Display Modes

Here’s the classic example where display modes fit in. Suppose you have a Home controller with an Index method.

public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}

As you know, you should also have a file named index.cshtml located under the Views/Home folder in the project. This file will provide the HTML for the browser. In the body of the Index method above you code (or better, you invoke from other components) the logic required to serve the request. If, by executing this piece of logic, data is produced which needs to be embedded in the view, then you pass this data down to the view object by adding an argument to the View method.

public class HomeController : Controller
{
    public ActionResult Index()
    {
        var model = ProcessRequestAndGetData();
        return View(model);
    }
}

So far so good.

Now in ASP.NET MVC 4 there’s an extra piece of magic that you might not know about yet. To experience the thrill of it, you add a new file to the Views/Home folder named index.mobile.cshtml. You can give this file any content you like; just make sure the content is different from the aforementioned index.cshtml.

Now launch the sample site and visit it with both a regular desktop browser, Internet Explorer perhaps, and a mobile browser. You can use the Windows Phone emulator or perhaps Opera Emulator. However, the simplest thing you can do to test the feature without much pain is to hit F12 and bring up the IE Developer’s Tools window. From there, you set a fake user agent that matches a mobile device. If you are clueless about what to enter, here’s a suggestion that matches an iPhone running iPhone OS 6:

Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko)

Quite surprisingly, the view you get for the same home/index URL is the mobile view as coded in the file index.mobile.cshtml.

What the heck is going on? Is this pure magic?

Even though I’m a firm believer that there can’t be any magic in software, well, I faced some terrible doubts until I found out about display modes.

Display Modes: Where Are They?

To spot where display modes are and the role they play, I then used .NET Reflector to statically track the code path starting with the View method on the Controller class. From the View method, the code flow reaches the selected view engine, the RazorViewEngine class in all cases in which CSHTML views are used. In ASP.NET MVC 4 all standard view engines inherit from the same base class, VirtualPathProviderViewEngine. This class has a new protected property named DisplayModeProvider, which is of type DisplayModeProvider. The VirtualPathProviderViewEngine class exposes some helper methods through which the view name is resolved. The view engine receives the view name as set at the controller level: it can be a name like “index” or it can be the empty string, as in the example above. If no view name is provided, the view engine assumes the name of the action.

In ASP.NET MVC 4, an extra step takes place in the VirtualPathProviderViewEngine base class from which both WebFormsViewEngine and RazorViewEngine inherit. During the resolution of the view name, the view engine queries the DisplayModeProvider object to see if any of the registered display modes can be applied to the requested view. If a match is found, then the original view name is changed to point to the CSHTML file that represents the match. So, for instance, it may happen that “index” becomes “index.mobile”.

Let’s now explore further the internals of the DisplayModeProvider class.

The DisplayModeProvider Class

The documentation refers to this class as being internal to the framework; however, it has a few public members that you might, and should, be using in order to extend your site with multiple ad hoc views. The class has a static constructor that .NET Reflector decompiles as below:

static DisplayModeProvider()
{
    MobileDisplayModeId = "Mobile";
    DefaultDisplayModeId = string.Empty;
    _displayModeKey = new object();
    _instance = new DisplayModeProvider();
}

And here’s the constructor instead:

internal DisplayModeProvider()
{
    List<IDisplayMode> list = new List<IDisplayMode>();
    DefaultDisplayMode mode = new DefaultDisplayMode(MobileDisplayModeId) {
        ContextCondition = context => context.GetOverriddenBrowser().IsMobileDevice
    };
    list.Add(mode);
    list.Add(new DefaultDisplayMode());
    this._displayModes = list;
}

It turns out that DisplayModeProvider holds a list of DefaultDisplayMode objects, each representing a display mode. By default, the provider holds two display modes: default and mobile. The default display mode is characterized by the empty string; the mobile display mode is characterized by the “mobile” string. These strings basically identify the suffix appended to the view name. This is where the file name index.mobile.cshtml comes from.

It is interesting to focus on the following code:

DefaultDisplayMode mode = new DefaultDisplayMode(MobileDisplayModeId) {
    ContextCondition = context => context.GetOverriddenBrowser().IsMobileDevice
};

In spite of a misleading name, the DefaultDisplayMode class is just the official class that represents a display mode. As I see things, the “Default” prefix in the name is just out of place. A display mode class is built around two main pieces of information: suffix name and matching rule. In the code snippet above, a new display mode class is created with the suffix of “mobile” (the actual value of the MobileDisplayModeId field) and a matching rule assigned to the ContextCondition property. The ContextCondition property is a delegate, as below:

Func<HttpContextBase, Boolean>

The purpose of the delegate is to analyze the HTTP context of the current request and return a Boolean answer to the question: should this display mode be used to serve the current request? How the Boolean response is determined is entirely up to the implementation. As defined above, the mobile display mode parses the user agent string that comes with the request and looks for known keywords that would mark it as coming from a mobile device. I’ll return to this point in a moment.

Listing Current Display Modes

You hardly have the need to do this in code, but I encourage you to try that out for pure fun. Here’s the code that reads and displays the currently available modes:

<ul>
    @{
        foreach(var d in DisplayModeProvider.Instance.Modes)
        {
            <li>@(String.IsNullOrEmpty(d.DisplayModeId) ?"default" :d.DisplayModeId)</li>
        }
    }
</ul>

You use the Instance static member to access the singleton instance of the DisplayModeProvider class and flip through the Modes property. By the way, the getter of the Modes property just returns the value stored in the internal _displayModes field dissected earlier through .NET Reflector.

Beyond the Default Configuration

The default and mobile display modes come free out of the box, but honestly they are not worth the cost. I have two reasons to say this. First, modern web sites need more than just a mobile/desktop dichotomy for views. You might want to distinguish tablets, smartphones, legacy phones, perhaps smart TVs. Sometimes this can be achieved with CSS media queries; sometimes you need to do server-side detection of the device via its provided user agent string. This leads to the second reason I have to blissfully ignore the default ASP.NET MVC configuration. Even if a plain desktop/mobile dichotomy works for your site, the logic behind the mobile context condition is weak and flaky. It has a good chance of working with iPhone and BlackBerry devices; it may not even work with Windows Phone and Android devices, let alone with older and simpler devices. The IsMobileDevice method you saw referenced a while back sniffs the user agent string based on the information it can find in the .browser files installed with ASP.NET.

The model is clearly extensible and you can add more information at any time; but writing a .browser file may not be easy and the burden of testing, checking, and further extending the database is entirely on your shoulders.

In my own tests, I added a fairly large (18 MB) browser file, an XML file actually, named mobile.browser. That file comes from an old Microsoft project, now discontinued, and contains a reasonable inventory of devices as of the summer of 2011. All devices and browsers that came later are not correctly detected.

In the end, display modes are an excellent piece of infrastructure, but they require a bit of work on your end for configuration, plus additional tools, to make view routing work effectively. The siren call about ASP.NET MVC being fully mobile aware is just a siren call.

What Can You Do About It?

You use display modes to give your site multiple views in front of the same URL. More concretely, this mostly means defining a display mode for each device, or class of devices, you’re interested in. You could create an iPhone display mode, for example. Likewise, you could create a tablet display mode. Here’s some code:

var modeTablet = new DefaultDisplayMode("tablet")
{
    // Matched first: requests our helper identifies as tablets
    ContextCondition = (c => IsTablet(c.Request))
};
var modeDesktop = new DefaultDisplayMode("")
{
    // Catch-all: everything else gets the desktop views
    ContextCondition = (c => true)
};
var displayModes = DisplayModeProvider.Instance.Modes;
displayModes.Clear();
displayModes.Add(modeTablet);
displayModes.Add(modeDesktop);

When run from Application_Start, the code drops the default modes and defines two new modes: tablet and desktop. The tablet mode is added first and will be checked first; the internal logic that finds the appropriate display mode stops at the first match. If the HTTP request is not matched to a tablet, it is treated by default with a view optimized for a desktop device.
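For the snippet above to compile, an IsTablet helper is assumed. Here is a deliberately naive sketch; as argued next, real detection should be delegated to a device description repository:

private static bool IsTablet(HttpRequestBase request)
{
    // Naive keyword sniffing: adequate for a demo, too flaky for production use
    var userAgent = request.UserAgent ?? String.Empty;
    return userAgent.IndexOf("iPad", StringComparison.OrdinalIgnoreCase) >= 0
        || userAgent.IndexOf("Tablet", StringComparison.OrdinalIgnoreCase) >= 0;
}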

How would you reliably determine whether a given request comes from a tablet? It’s all about sniffing the user agent string, but you need a professional tool for that. The answer is getting a commercial license from a Device Description Repository vendor such as ScientiaMobile for WURFL. Note that WURFL is simply the most widely used device database (Facebook and Google use it); it is free for open source projects and has a partly free cloud version. Other products exist too. But my point here is that you should not spend a second crafting your own solution for sniffing user agent strings.

 



ASP.NET MVC 4 Hosting - Overriding Browser Capabilities in MVC 4

clock March 4, 2013 09:31 by author andy_yo

The new System.Web.WebPages 2.0.0.0 assembly that ships with the latest MVC 4 contains a pretty cool feature that lets you override the current browser capabilities. Sure, most modern browsers let you set a custom user agent string out of the box or via extensions. However, there are certain scenarios where you would want to switch the user agent on the server side. That’s where the BrowserHelpers class comes in handy.

Override Browser Context

A good example of where you would want to override the browser capabilities is when developing mobile views. You may not want to simulate a particular device; you just want to tell ASP.NET that the client is a mobile device and that it should use the .mobile view. You can call the SetOverriddenBrowser extension method and pass in the BrowserOverride enum (Mobile/Desktop options).

public ActionResult Mobile()
{
    HttpContext.SetOverriddenBrowser(BrowserOverride.Mobile);
    return RedirectToAction("Index");
}

If you want, you can override the browser’s full user agent by calling the SetOverriddenBrowser extension method on HttpContextBase:

public ActionResult Iphone()
{
  HttpContext.SetOverriddenBrowser("Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_0 like Mac OS X; en-us) AppleWebKit/532.9 (KHTML, like Gecko) Version/4.0.5 Mobile/8A293 Safari/6531.22.7");
    return RedirectToAction("Index");
}

Then, to clear the override, simply call the ClearOverriddenBrowser extension method:

public ActionResult Clear()
{
    HttpContext.ClearOverriddenBrowser();
    return RedirectToAction("Index");
}

What is happening under the hood

When you call the SetOverriddenBrowser method, ASP.NET sets a “.ASPXBrowserOverride” cookie. This is done using the CookieBrowserOverrideStore from System.Web.WebPages, which implements BrowserOverrideStore; if you’re interested, check it out in dotPeek.

The value of the cookie is the user agent that you have set or, in the case of the BrowserOverride.Mobile enum, Mozilla/4.0 (compatible; MSIE 6.0; Windows CE; IEMobile 8.12; MSIEMobile 6.0). The expiry date is set to 7 days, so the override stays in place even if you re-open your browser. Calling ClearOverriddenBrowser simply clears the cookie.

Create Mobile Switched Filter

The jQuery.Mobile.MVC package comes with the ViewSwitcher razor partial and the ViewSwitcherController. This does more or less exactly what I described above. However, if you are lazy like me, you may want to switch between mobile/desktop views using the query string rather than a controller action. This is useful when you just want to quickly check your mobile views.

public class BrowserCapabilitiesSwitcherFilter : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var switchParameter = filterContext.RequestContext.HttpContext.Request.QueryString["switch"];
        if(string.IsNullOrEmpty(switchParameter))
            return;
        var browserOverride = BrowserOverride.Desktop;
        if(Enum.TryParse(switchParameter, true, out browserOverride))
        {
            //switch between BrowserOverride.Desktop / BrowserOverride.Mobile
            filterContext.RequestContext.HttpContext.SetOverriddenBrowser(browserOverride);
        }
        else
        {
            //set the user-agent string
            filterContext.RequestContext.HttpContext.SetOverriddenBrowser(switchParameter);
        }           
    }
}
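To put the filter into effect site-wide, register it as a global filter at application startup. Assuming the default MVC 4 project layout, that looks like this:

// In App_Start/FilterConfig.cs, called from Application_Start in Global.asax.cs
public static void RegisterGlobalFilters(GlobalFilterCollection filters)
{
    filters.Add(new HandleErrorAttribute());
    filters.Add(new BrowserCapabilitiesSwitcherFilter());
}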

Simply use it by typing http://yoursite.com/page?switch=Mobile to preview in mobile and then http://yoursite.com/page?switch=Desktop to switch back. For the more adventurous, you can pass in the user agent directly http://yoursite.com/page?switch=UserAgentString

 

 



DotNetNuke Hosting - DotNetNuke versus Joomla

clock February 5, 2013 05:45 by author andy_yo

DotNetNuke and Joomla are two very popular CMSs (content management systems) based on different technologies. DotNetNuke is a content management system written in Visual Basic on the ASP.NET framework, whereas Joomla is written in PHP. Each has its own advantages and disadvantages, which we will discuss in detail below.

About ASPHostPortal.com

ASPHostPortal.com is Microsoft No #1 Recommended Windows and ASP.NET Spotlight Hosting Partner in United States. Microsoft presents this award to ASPHostPortal.com for ability to support the latest Microsoft and ASP.NET technology, such as: WebMatrix, WebDeploy, Visual Studio 2012, ASP.NET 4.5, ASP.NET MVC 4.0, Silverlight 5 and Visual Studio Lightswitch. Click here for more information

Core Functionality: DNN offers extensive core functionality, next to which Joomla seems a little weak. DNN offers features like database replication, event management, a photo gallery, and a built-in forum system. Joomla also offers functionality that is not available in any other PHP-based CMS, such as load balancing and a trash bin to ensure that articles are not accidentally deleted. However, Joomla still falls short of DNN in terms of core functionality.

Customization and Extensions: Joomla compensates for its lighter core with extensive third-party plugins and customization facilities. Joomla provides a core framework around which developers can build any site with the desired functionality. There is a plugin available for almost everything in Joomla, so extra functionality can be added and customized according to a client's requirements. Moreover, a vast number of templates are available on the internet, from which developers can choose suitable themes and customize them to their needs.

However, Joomla template designs are often based on similar layouts, which usually results in the development of many similar-looking websites with only slight changes in design and color. DNN, by contrast, offers a high level of flexibility and thus provides an opportunity to create unique websites.

Basic Technology: Joomla and DotNetNuke are built on different technologies, which makes it essential to take their technical differences into account. Joomla uses technologies like PHP and a MySQL backend, which are used extensively for web development and web application development. DotNetNuke, on the other hand, uses Microsoft's technically superior ASP.NET framework, which is more expensive in a regular web-hosting environment. However, small and medium businesses usually have servers running Microsoft's IIS, which eliminates Joomla's open source advantage.

DotNetNuke is more useful and feasible for corporate and enterprise intranets, which need to integrate with existing systems that are usually built using the same technology. They require the sophistication, flexibility, and robustness offered by DNN, which enterprises can also readily afford. Joomla, on the other hand, is designed to provide a quick, expandable, and cost-effective web presence.

Support: DotNetNuke offers support that varies with the edition you have. The basic version provides a developers' forum through which users can get assistance from other developers active on the forum. The paid "Professional" edition offers unlimited online support, whereas the "Elite" edition provides live telephone support with responses in under 2 hours. Joomla does not provide a paid support system, but various third-party organizations offer training and support for Joomla.

The paid versions of DotNetNuke, i.e. the "Professional" and "Elite" editions, are extensively tested and officially verified, which makes them a good choice for business applications that require stability and polish.

Ease of Use: DotNetNuke allows quick and easy content editing, whereas in Joomla users have to first sign into a different section of the site before they can make changes to content. You can easily change the position of modules in DNN by drag and drop, which makes it much easier. Apart from this, both frameworks offer similar ease of use via features such as built-in macro languages, the ability to mass upload, and search-engine-friendly URLs.

In comparing them we just cannot declare one better than the other, as each has some advantages and disadvantages of its own. What we can say with certainty is that DotNetNuke is better for business applications and the creation of business-scale websites, whereas Joomla is ideal for making quick, functional, and cost-effective websites.

 

 



About ASPHostPortal.com

We’re a company that works differently to most. Value is what we output and help our customers achieve, not how much money we put in the bank. It’s not because we are altruistic. It’s based on an even simpler principle. "Do good things, and good things will come to you".

Success for us is something that is continually experienced, not something that is reached. For us it is all about the experience, more than the journey. Life is a continual experience. We see the Internet as being an incredible amplifier to the experience of life for all of us. It can help humanity come together to explode in knowledge exploration and discussion. It is continual enlightenment of new ideas, experiences, and passions.




Corporate Address (Location)

ASPHostPortal
170 W 56th Street, Suite 121
New York, NY 10019
United States
