Windows Management and Scripting

A wealth of tutorials on Windows Operating Systems, SQL Server and Azure


Posts Tagged ‘web applications’

Learn the workflow reports in SharePoint 2010

Posted by Alin D on June 21, 2011

Workflow in both SharePoint 2007 and SharePoint 2010 can be a powerful feature to help automate business processes and enforce best practices. But, without proper oversight, workflows can get bogged down by end users, halted because of errors or simply ignored by workflow participants. To avoid these situations, administrators can rely on workflow reports and built-in monitoring within SharePoint.

To get started, you should know something about workflows in SharePoint. Out of the box, workflows can be tightly coupled with content types, and any time content of a particular type is contributed to a project, the workflow could be activated, regardless of where the content is contributed.

Alternatively, workflows can be activated by contributing content to a specific list or library. Finally, if configured to do so, end users — or content contributors — can activate a workflow manually. Because of the variety of methods used to initiate workflow processes, reporting on the status of those workflows can help administrators figure out what’s going on.

The first indication that a workflow has run or could run on content will be in the standard document library view. Along with the columns (metadata) in that library for any given view, there is a column for every workflow.

The column shows the status of the workflow running against the specific document. In this example, the Gears Sales History Excel workbook has had an approval workflow started and completed.

The last step in that process is an approval step, and the status is shown in the column corresponding to the workflow. If you click on the status, you get additional information related to that specific workflow instance.

The additional information on the workflow represents the outcome of the last time that workflow ran on that document. The basic data tells you when this specific instance of the workflow started and when it last ran.

It also shows you the status of the document. In this example, the last step was an approval, and the document was indeed approved. If additional information is required, you can look at the workflow history shown in the next figure, which is a complete record of this instance of the workflow. It provides detailed information regarding the steps taken during the workflow, who was involved and — in this case — what errors occurred during processing. Although it’s unlikely that an administrator with farm-level visibility would access workflow history this way, it does show what’s available and what’s tracked by SharePoint.

If you’re only interested in one document, this method works fine. In fact, it could be an excellent resource for power users who want to understand what’s happening with their documents. But for the site or farm administrators of the world, this method isn’t really that useful because the scope is too narrow.

Fortunately, there are facilities within SharePoint that can shed light on the broader view of workflows within a site collection. At a basic level, site administrators can view which workflows are associated with the site collection. In the Site Settings menu for a site collection, there’s a Workflows link in the Site Administration category. In the next image you can see that this link takes you to the list of workflows currently associated with the site.

Although the interface has a somewhat limited use, it does give you an overview of every workflow and tells you whether it’s associated with content and whether or not it’s active. This way, SharePoint administrators can get a broad overview quickly. Unfortunately, none of the information on this interface is clickable, so drilling in more deeply is not possible.
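
If you need a broader, scriptable view, the site collection can also be walked with PowerShell. The sketch below is only an illustration — the site URL is an example — and it simply lists every workflow association on every list, together with whether it is enabled and how many instances are currently running:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$site = Get-SPSite "http://sp2010"
foreach ($web in $site.AllWebs) {
    foreach ($list in $web.Lists) {
        foreach ($assoc in $list.WorkflowAssociations) {
            # One line per workflow association: where it lives, its name, state and running instances
            "{0} | {1} | {2} | Enabled: {3} | Running: {4}" -f $web.Url, $list.Title, $assoc.Name, $assoc.Enabled, $assoc.RunningInstances
        }
    }
    $web.Dispose()
}
$site.Dispose()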

When you need additional detail, you can drill down into workflow reports, shown at the list management level, through the Workflow Settings option located on the far right of the ribbon. Workflow settings tell you which workflows are associated with specific content within the list.

The last option on that interface is View Workflow Reports, which gives you the ability to report on the specific workflows associated within the list. The menu will display all of the currently associated workflows, giving you an option to report on activity and cancellations/errors.

If you select an individual report, you’ll be walked through a report creation wizard. The wizard immediately prompts you to specify the library where the report will be stored when completed.

In all cases, SharePoint dynamically creates an Excel workbook with both a pivot table report layout and the raw data. The pivot table is moderately useful for providing summary-level information but not much beyond that. But with the supplied raw data, you have the opportunity to craft your own reports and combine that data with data from other reports to create a composite picture of your workflow activity.

Like many other facets of SharePoint 2010, Microsoft tried hard to work in aspects of the Office products. In the context of workflow, the primary tool is Visio. If you’ve been paying attention to the new features, you know that Visio is now a first-class workflow creation tool, working cooperatively with SharePoint Designer.

What you may not know is that the Web version of Visio, part of the Office Web applications, provides you with visualization capabilities. You also have the opportunity to “see” the workflow steps, those involved and basic iconography that can help give you insight on the process.

In the end, workflow can provide a great deal of value in most organizations. The challenge, historically, was getting enough insight into what’s actually going on under the covers. Through SharePoint 2010, Microsoft has given administrators some of the reporting and monitoring capabilities you need to keep tabs on processes. All of this combines to give you greater visibility and — more important — an ability to manage the increasingly complex world of your SharePoint environment.

 

Posted in TUTORIALS

ASP.NET 4 web hosting on Windows 2008

Posted by Alin D on March 10, 2011


Microsoft ASP.NET is a programming framework built on the common language runtime (CLR) that can be used to create anything from small, personal websites through to large, enterprise-class web applications with a minimum of coding.

The first versions of ASP.NET offered several important advantages over previous Web development models. The latest release – ASP.NET 4 and Visual Studio 2010 – includes many new features and improvements that enable you to easily build, deploy and manage great Web sites and applications.

ASP.NET 4 Enhancements

The Microsoft .NET Framework 4 provides the following new features and improvements:

Improvements in Common Language Runtime and Base Class Library
– Performance improvement including better multicore support, background garbage collection, and profiler attach on server.
– New memory mapped file and numeric types.
– Easier debugging including dump debugging, Watson minidumps, mixed mode debugging for 64 bit and code contracts.
Improvements in Data Access and Modeling
– The Entity Framework enables developers to program against relational databases using .NET objects and Language Integrated Query (LINQ). It has many new features, including persistence ignorance and POCO support, foreign key associations, lazy loading, test-driven development support, functions in the model, and new LINQ operators. Additional features include better n-tier support with self-tracking entities, customizable code generation using T4 templates, model first development, an improved designer experience, better performance, and pluralization of entity sets.
– WCF Data Services is a component of the .NET Framework that enables you to create REST-based services and applications that use the Open Data Protocol (OData) to expose and consume data over the Web. WCF Data Services has many new features, including enhanced BLOB support, data binding, row count, feed customization, projections, and request pipeline improvements. Built-in integration with Microsoft Office 2010 now makes it possible to expose Microsoft Office SharePoint Server data as an OData feed and access that data feed by using the WCF Data Services client library.
Enhancements to ASP.NET
– More control over HTML, element IDs and custom CSS that make it much easier to create standards-compliant and SEO-friendly web forms.
– New dynamic data features including new query filters, entity templates, richer support for Entity Framework 4, and validation and templating features that can be easily applied to existing web forms.
– Web forms support for new AJAX library improvements including built-in support for content delivery networks (CDNs).
Improvements in Windows Presentation Foundation (WPF)
– Added support for Windows 7 multi-touch, ribbon controls, and taskbar extensibility features.
– Added support for Surface 2.0 SDK.
– New line-of-business controls including charting control, smart edit, data grid, and others that improve the experience for developers who build data centric applications.
– Improvements in performance and scalability.
– Visual improvements in text clarity, layout pixel snapping, localization, and interoperability.
Improvements to Windows Workflow (WF)
– These include an improved activity programming model, an improved designer experience, a new flowchart modeling style, an expanded activity palette, workflow-rules integration, and new message correlation features. The .NET Framework 4 also offers significant performance gains for WF-based workflows.
Improvements to Windows Communication Foundation (WCF)
– These include support for WCF Workflow Services, enabling workflow programs with messaging activities and correlation support. Additionally, the .NET Framework 4 provides new WCF features such as service discovery, routing service, REST support, diagnostics, and performance improvements.
New Parallel-Programming Features
– These include parallel loop support, the Task Parallel Library (TPL), Parallel LINQ (PLINQ), and coordination data structures, which let developers harness the power of multi-core processors.
ASP.NET Web Hosting

Intermedia provides ASP.NET web hosting on Windows 2008/IIS 7 servers with .NET support. Microsoft Internet Information Services (IIS) is used as the web server, together with the Microsoft .NET Framework 1.0, 1.1, 2.0, 3.0 and 3.5. For dynamic page creation, Intermedia provides .NET web hosting, ASP web hosting and MSSQL hosting services. The choice of ASP.NET web hosting follows from a choice of Microsoft development technologies – IIS, Windows Server and MSSQL Server – as the platform for web application development. ASP.NET web hosting also allows you to work with high-performance MSSQL Server databases.

ASP.NET 3.5, .NET 3.0, and ASP.NET 2.0 are still available on our Windows 2008 Hosting and Windows 2003 Hosting platform. ASP.NET 1.1 Hosting is only available on our Windows 2003 Hosting platform.
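
If you are unsure which .NET Framework versions are actually present on a given Windows 2008 host, a quick, illustrative PowerShell check (reading the standard NDP registry key) is shown below; this is a generic technique rather than anything specific to a particular hosting provider:

Get-ChildItem 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP' -Recurse |
    Get-ItemProperty -Name Version -ErrorAction SilentlyContinue |
    Select-Object PSChildName, Version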

Article from articlesbase.com

Posted in Windows 2008

SharePoint 2010 Service Applications Architecture (SSA)

Posted by Alin D on December 9, 2010

If you have any experience administering MOSS farms, you probably already know how isolated Shared Service Providers (SSPs) can be and how extremely difficult it is to restore an SSP in a disaster recovery scenario. The good news is that in SharePoint 2010 SSPs are gone, and we now have more flexible services that can be shared even between farms – SharePoint Service Applications (SSA). Before we take an in-depth look at SSA, we need to look at the terminology Microsoft uses when describing the new Service Application architecture:

  • Service – a set of binaries installed on the server farm which are capable of some functionality.
  • Service Application – the service described above, but configured for a specific SharePoint farm.
  • Service Application Proxy – really a pointer to a Service Application within the farm that exists on the web front-end.
  • Service Instance – the instance of the Service Application running on the application server.
  • Service Consumer – a feature, such as a Web Part, which uses the Service Application to communicate with the end user and present the service results in the browser.

SharePoint 2010 Architecture vs SharePoint 2007 Architecture

In the example architecture below, you can see that we have two SSPs in the farm, each with its own set of services and applications. In this SharePoint 2007 environment you can’t share services between SSPs, let alone between farms – you have to set up new services for every SSP you create.

Office SharePoint Server 2007 Service Architecture

In the SharePoint 2010 architecture below, you can see that we no longer have the SSP. It is simply not needed, since we now use Service Application instances for every web application we choose, and the services can be shared between different web applications. In our example we are using the same User Profiles service for all web applications, and dedicated services for every application such as the Search Service, BCS (BDC) Service, Access and Visio Services, etc. You may even share the Service Applications between SharePoint farms using Service Application Publishing.

SharePoint Server 2010 Service Architecture

Because of these architecture changes, it is now more complex to plan and design SharePoint environments based on farms and services. In SharePoint 2007 we simply had to create an SSP and applications under it, with all the required services attached to the isolated SSP environment. Whilst SharePoint SSA is a major advance, not all services can be published (which means not all can be shared between SharePoint farms):

Services that can be published: Users and Profiles (people-related applications), Metadata Services, Business Connectivity Services (BCS), Search Services, Secure Store Services, Web Analytics Services.
Services that cannot be published: Usage and Health Services, Site Services, Project Services, Excel Calculation Services, Access Web Services, Visio Web Services, Word Web Services, Performance Point Services.

In general, services that are based on web applications (Word, Visio, Access, and Excel) cannot be shared between farms, but they can still be shared between web applications, so it is probably not a major issue. Now that the overall concept of the new SharePoint Services architecture is clear, let’s go deeper and see how it works from an administrative viewpoint. To use a shared service, you must first provision a Service Application (which is really the service instance) valid for the farm where it is deployed. A Service Application contains:

  • Admin interface (for service management within the farm)
  • IIS application pool
  • Associated databases (it may use more than one database, or none, depending on the service design)
  • Server instance (the process or web service that is physically running on the server)
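
Once the farm is up, the quickest way to get an inventory of the service applications and proxies it already contains is the SharePoint 2010 Management Shell. The sketch below is a minimal illustration using the standard Get-SPServiceApplication and Get-SPServiceApplicationProxy cmdlets:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# Every service application configured in the farm, with its type and status
Get-SPServiceApplication | Select-Object DisplayName, TypeName, Status
# The proxies (connections) that web applications use to consume those service applications
Get-SPServiceApplicationProxy | Select-Object DisplayName, TypeName, Status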

Worked Example using SharePoint SSA

We want to create a new service application instance for our farm – the Visio Graphics Service – so we go ahead and choose this service from the list in the Service Applications screen in Central Administration.

Next we have to give the instance a name, select the application pool (it is always good practice to create a new one) and decide whether we want a proxy attached to the service instance. For this example we want the default proxy added, so we leave this box checked (the default behavior).

Visio Graphics Service Application setup window

The Service Applications list now shows our newly created instance, and the instance is also visible on the IIS server. To identify our newly created (or any other) service instance, open IIS Manager and navigate to the “SharePoint Web Services” web application.

IIS 7.5 SharePoint Web Services list

Now a big minus – we only see a long GUID that isn’t readable. The simplest way to find our newly created service is to explore each GUID until we find our service name inside.

Explorer window with the Visio Graphics Service files

Please note that Microsoft has redesigned the services to use the SVC extension (WCF web services) instead of ASP.NET web services (ASMX extension).
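
The same Visio Graphics Service instance can also be provisioned from the SharePoint 2010 Management Shell instead of Central Administration. The sketch below is only an illustration — the application pool name, managed account and display names are placeholders you would replace with your own:

$account = Get-SPManagedAccount "chaos\spsadmin"
$pool    = New-SPServiceApplicationPool -Name "VisioGraphicsAppPool" -Account $account
$app     = New-SPVisioServiceApplication -Name "Visio Graphics Service" -ApplicationPool $pool
New-SPVisioServiceApplicationProxy -Name "Visio Graphics Service Proxy" -ServiceApplication $app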

Posted in TUTORIALS

How To Install an SSL Certificate using IIS 7

Posted by Alin D on November 4, 2010

To install an SSL certificate in IIS, you first need to issue a certificate for your web server. For this purpose, select the web server root node in the navigation tree of the management console, and then select the Server Certificates feature, as shown below:

After selecting Server Certificates, the IIS management console lists all the server certificates installed on the web server (see below). The first thing to note is that in IIS 7 you can install multiple server certificates on one web server, which can be used for the multiple websites set up on that web server (previous IIS versions allowed you to install only one server certificate per web server).

In the Server Certificates feature details view in the IIS Management Console, the task pane on the right side shows the necessary tasks for installing server certificates. You can automatically create a certificate request that you can then use to request a new certificate from a CA. To create a new request, click the Create Certificate Request task link in the pane; this creates the same Base64-encoded request as in previous versions of IIS. Use this Base64-encoded request file to submit your request to the CA. After retrieving the certificate from the CA, you complete the outstanding request by clicking the Complete Certificate Request link. In this way you can both request and configure an SSL certificate for a standalone web server. If you want to request an SSL certificate from your own CA, use the Online Certification Authority wizard by clicking the Create Domain Certificate link. This certificate will then be configured in your own CA and will be used for signing certificates issued by this CA.

This process is quite laborious if you are a developer who just wants to test SSL with your own web apps. Therefore, IIS 7 ships with an additional option – creating a self-signed certificate for just your own machine. Just click the Create Self-Signed Certificate link in the console, and all you need to specify is a friendly name, which will be displayed in the listing. The wizard creates a certificate using the cryptographic functions of your local machine and automatically installs the certificate on your web server. It is important to note that these certificates should only be used for development and testing purposes, since only the browser on your local machine will know the certificate; other clients will show warnings that the certificate is invalid.

Once you have configured and installed the SSL certificates, you can leverage them for SSL-based communication in the sites configured on your IIS. To do this, you need to configure the protocol bindings for SSL, as well as the SSL options for any web apps within the websites.

Configuring Bindings for SSL in IIS

Bindings are used for making the content of websites available through specific protocols, IP addresses, and ports. In addition, the host headers for accessing multiple web apps through the same IP address and port are also configured in the bindings. To use SSL for apps configured within a website, you need to configure a protocol binding for SSL for that website. To do this, select your website in the navigation tree of the IIS Management Console and then select the Bindings link from the task pane on the right-hand side. A dialog will appear which allows you to configure the bindings. Here, you add new bindings to make the contents available through different IP addresses, protocols, and ports, as shown below. Click Add to add a new binding to the website, and Edit to modify existing bindings in the list.


As you can see in the screenshot, the protocol has been configured to https, running on the default IP address for the server and using port 443 for SSL-based access (the default port for SSL). In addition, in the dropdown list you can select the certificate to be used for SSL traffic on the website. Each certificate you installed previously is available for selection in this listing, and you can set up different certificates for different websites on the server. After you have configured the SSL binding for your website, you can enable SSL for web applications within the website.
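
For completeness, the same HTTPS binding can also be created from an elevated command prompt rather than the GUI. The commands below are a rough sketch — the site name, certificate thumbprint and application GUID are placeholders — using appcmd (in C:\Windows\System32\inetsrv) to add the binding and netsh to bind the certificate to port 443:

appcmd set site /site.name:"Default Web Site" /+bindings.[protocol='https',bindingInformation='*:443:']
netsh http add sslcert ipport=0.0.0.0:443 certhash=<certificate_thumbprint> appid={00000000-0000-0000-0000-000000000000}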

Encrypting Information with SSL

SSL is enabled and configured for each individual site/app in IIS. Once you have configured the bindings at the website level, you can select the web app of your choice in the navigation tree of the IIS Management Console and then activate the SSL configuration as shown below:

You can specify whether SSL encryption is required for the chosen web app and whether client certificates are required to authenticate users. If you are using client certificate authentication, you will need to configure the mappings from certificates to the users that are eventually authenticated by IIS when the certificate is presented. Configure these mappings in the web.config file.

Posted in TUTORIALS

How to Migrate Content in SharePoint 2010

Posted by Alin D on November 1, 2010

In this tutorial I will describe how to migrate content between SharePoint site collections and even entire farms. Very often you need to clone, copy or simply create a testing environment out of your production sites, so good migration procedures are essential to every SharePoint administrator. The best tool for content migration is the export/import functionality.

For the purposes of this tutorial I created a Site Collection from the Team Site template called IT Department, and then created the Technical Support site within the site collection. Now, let’s migrate the Technical Support site to a different site collection.


IT Department site with Technical Support sub site visible in the top-menu.

Using stsadm, you can move the site with the export/import commands. Let’s try to export the site using the stsadm tool. If you want, you can provide additional parameters, which can be very important to the end result of the migration. To see the options, type “stsadm -o export” without additional parameters.


stsadm -o export command parameters visible in the cmd.exe window

Two of the most commonly used options are includeusersecurity and versions. If you add the includeusersecurity parameter, the site will be exported with its security assignments. If you don’t want to lose the security settings after the migration, you should add this parameter to the command. The versions parameter is used when you have versioning enabled and you don’t want the default content migration behavior, which exports only the last major version. Using the default settings you may find that some documents are missing from the migration if they have no major versions at all (only minor versions, which is often the case). I would suggest using the versions 2 parameter if you only want the latest document versions migrated. Now we can revise the command to include security and the latest document version:

stsadm -o export -url http://sp2010/techsupport -filename C:\techsupportsite -includeusersecurity -versions 2

When this command is executed, stsadm will start logging to the cmd window and to a log file in the export directory. If all the content is successfully exported, you will see the end result with no warnings or errors. If you encounter any warnings or errors, you can use the export log file to find and fix the issues before migrating the site.


End of the Export command with the successful report

Now, we will use the import command to restore the exported site on a different site collection. You can also use this file to import the site with content on different web applications, or even different farms, but then you will have to provide the same set of site collection features, web parts and security accounts.

In SharePoint 2010, you can import to a URL that doesn’t yet exist and you do not have to create the empty site just to import the content later.

In the command line, enter the below command:

stsadm -o import -url http://sps2010/sites/newsitecollection/techsupport -filename techsupportsite.cmp -includeusersecurity

After a successful import we can check the migrated site to identify any issues. In this example there were no issues with the sample content, but this will obviously depend on the migrated content. If you are using third-party SharePoint objects, you should deploy them on the new application prior to running the import command.


Site migrated to new site collection using stsadm tool.

In SharePoint 2010 we can also use PowerShell for content migration.  With the new PowerShell capability, we can export/import a site, list or document library.

To do this, open the SharePoint 2010 Management Shell from  Start – All Programs:


SharePoint 2010 Management Shell location in the Start Menu

In the Windows PowerShell window, I am going to use the Export-SPWeb command. Export-SPWeb is capable of exporting a site collection, web application, list or library. Using stsadm (which was the only way in SharePoint 2007) we are not able to export just a list or library since  the lowest level is the site.

Let’s look at an example which corresponds to the stsadm we used before:

Export-SPWeb -Identity http://sps2010/techsupport -Path C:\techsupportPS -IncludeUserSecurity -IncludeVersions CurrentVersion

The syntax here is very similar, but there are additional parameters that can be set with this cmdlet. Some parameters are also extended or work differently than their corresponding stsadm options. Below is a description of the more important parameters:

  • Identity: Specifies the URL of the site (web) you wish to export. You can also use the object’s GUID instead of the URL.
  • Path: Specifies the location and name of the exported file. If you use NoFileCompression flag, you will have to specify a directory in the Path string parameter.
  • Confirm: Prompts for user permission before executing script. Useful when scripting sets of management commands that require user input.
  • ItemURL: Specifies the URL of the SharePoint object you want to export. In this parameter you can specify the lower objects, like lists and document libraries.
  • UseSqlSnapshot: This will tell the export mechanism to use the direct database snapshot to collect the data, instead of using the default SharePoint web-based mechanisms.
  • WhatIf: Use this parameter when you only want to check what would happen if you attempted to export a  SharePoint object. It will not run any code, but only show  the message and describe the effect of running the cmdlet.

As with the stsadm, where there’s an export – there is also an import. To use the exported cmdlet file, use the Import-SPWeb cmdlet.

Example usage:

Import-SPWeb http://sps2010/sites/newsitecollection/techsupport -Path C:\techsupportPS.cmp -Overwrite

With the PowerShell Export-SPWeb and Import-SPWeb you can migrate sites, site collections, lists and document libraries between entirely different SharePoint 2010 farms.
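
For example, the ItemUrl parameter described above lets you narrow an export down to a single list or document library. The command below is only a sketch — the URLs and the export path are illustrative, and the exact form ItemUrl expects (server-relative vs. site-relative) is worth verifying in your own environment:

Export-SPWeb -Identity http://sps2010/techsupport -ItemUrl "/techsupport/Shared Documents" -Path C:\techsupportDocs.cmp -IncludeUserSecurity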

That’s about it in terms of SharePoint content migration. You may think I didn’t show you the Central Administration and Unattached Database Recovery functionality… but why? Because it’s simply the export/import we’ve just learned, but with the  GUI added to the web site. Of course using PowerShell gives us much, much more than using Central Administration.

Posted in TUTORIALS

SharePoint Performance Tuning

Posted by Alin D on November 1, 2010

I was recently looking for SharePoint 2010 performance articles on Microsoft pages and established blogs, and found that most of them didn’t cover all the details. Some of them simply described SQL Server-based tips, some were purely system-related, and it was extremely hard to find useful SharePoint-specific performance tips. I’ve decided to try to fill this gap and provide all the SharePoint performance steps and details I know in one place.

SharePoint Hardware Planning

Before you even start thinking about improving your performance, keep in mind that even the best tips won’t help you if your hardware is simply too weak to handle a SharePoint environment.

This article is not intended to explain how to plan your hardware environment, but the only detail that is worth mentioning is that you should know the future details of your SharePoint farm BEFORE you buy the hardware, such as:

  • Total number of SharePoint farm users
  • Simultaneous number of SharePoint farm users
  • Services that would be provided (Search, FAST Search Server, Office Web Access, Visio Services etc. may decrease performance so you probably need to provide dedicated hardware for this)
  • Amount of data that will be stored and processed by the SharePoint farm on a  daily/weekly/monthly basis.

Knowing the above, you can probably design your infrastructure successfully and be happy with the performance of your SharePoint farm after the deployment.

Note: a useful tool to plan your infrastructure (if you know the above details) is the HP Sizer for Microsoft SharePoint, which can be accessed at http://h20338.www2.hp.com/activeanswers/Secure/548230-0-0-0-121.html .

SharePoint Front End Caching

SharePoint Server 2010 ships with strong caching capabilities: the BLOB (Binary Large OBject) cache, cache profiles and the object cache. We’ll start with the BLOB cache.

The BLOB cache is disk-based caching that greatly increases browser performance and reduces database load, since SharePoint reads cached content from BLOB files instead of the database. When you open a web page for the first time, the files are copied from the database to the cache on the hard drive of the SharePoint server, and all subsequent requests for this content are served from the local hard drive cache instead of issuing a resource-intensive request to the SQL Server database.

To enable the BLOB cache for a web application of your choice, you need to edit the web.config file. Open IIS Manager on the front-end server where your web application resides, and use the Explore option to find where it is located on the hard drive (usually C:\inetpub\wwwroot\wss\…).


IIS Manager Explore option for application SharePoint – 80

Next, open the web.config file with your favorite text editor (notepad will be sufficient for this).


Web.config file in the application root directory

Now, find the line starting with:

<BlobCache location=

and set the properties correctly. We need to set the cache directory and change the “enabled” attribute to “true”. It is strongly recommended to store the cache on a dedicated partition that is not part of the operating system (the C: partition is not recommended). This is why I’ve stored my cache on the D: partition.

<BlobCache location="D:\BlobCache\14" path="\.(gif|jpg|jpeg|jpe|jfif|bmp|dib|tif|tiff|ico|png|wdp|hdp|css|js|asf|avi|flv|m4v|mov|mp3|mp4|mpeg|mpg|rm|rmvb|wma|wmv)$" maxSize="10" enabled="true" />

In the path attribute, you can add or remove file extensions that will be cached. The maxSize attribute is used to change the maximum size of the cache on your hard drive in gigabytes (GB); the default maximum size is 10 GB.

To configure cache profiles, we will also use the web.config file. This will allow us to override the user interface cache profile settings, so we have full control over the process. To use cache profiles, the site collections must have the publishing feature enabled first.

To enable cache profiles, find the line in web.config:

<OutputCacheProfiles

and set the attributes of this tag appropriately:

useCacheProfileOverrides="false": change this to "true" to enable overriding the cache profile settings.

The next three attributes (varyByHeader, varyByParam and varyByCustom) define custom parameters in the .NET Framework class library – we don’t need to change these, so the default settings are fine. The varyByRights attribute, when set to "false", removes the requirement that users have identical effective permissions on all securable objects before they can be served the cached version of a page. Change this value to "false".

The cacheForEditRights attribute bypasses the default behavior of the page caching per user. Change this attribute to “true”.

The final result of the modified output cache profiles line in web.config should be similar to this:

<OutputCacheProfiles useCacheProfileOverrides="true" varyByHeader="" varyByParam="*"
 varyByCustom="" varyByRights="false" cacheForEditRights="true" />

Next we need to configure the object cache. Object cache settings can be altered at the site collection level using the user interface, and this cache is enabled by default. The maximum size of this cache can be configured at the web application level on the web front-end servers (as with the cache profiles). To use the object cache, the site collections must have the publishing feature enabled.

To change object cache settings, open the web.config file of our application and find the line:

<ObjectCache maxSize

The default value for the maxSize attribute is 100, which means 100 megabytes (MB) will be used for the entire web application for object caching. You should modify this value to make better use of the physical memory on the front-end server: if you see that a server consistently has more than 30% available memory, you can improve site performance by increasing the maxSize attribute.
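
For example, on a front-end server with plenty of spare RAM you might dedicate a few hundred megabytes to the object cache; the value below is purely illustrative:

<ObjectCache maxSize="300" />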

That’s all about the SharePoint caching options, which are mostly configured in the web.config file. Now that we have the BLOB cache enabled and the cache profiles and object cache tweaked to make full use of our hardware, we can move on to tuning SharePoint authentication, which is the focus of the next part of this performance series.

Enabling Kerberos Authentication

If your sites are serving numerous requests at a time and you are experiencing slow page loads, you should consider switching the site-level authentication from NTLM to Kerberos. Whilst NTLM is fine for small or medium sized sites, Kerberos is useful when your environment carries a high workload and needs to process a large number of requests. With NTLM, authentication requests aren’t cached and have to go to the domain controller every time a request is made for an object, which is a performance drag. With Kerberos, authentication requests can be cached, so the process doesn’t have to contact the domain controller for every object retrieved from the site; this can dramatically improve SharePoint performance.

To enable Kerberos authentication for your web application, we’ll have to specify the application pool identity and then create a new SPN using the setspn.exe tool.

Go to IIS Manager on the web server and, using the left pane, select the website where you want to enable Kerberos authentication (1). Then open the Authentication icon, select Windows Authentication (2) (which should be enabled) and click Advanced Settings (3). Make sure that the “Enable Kernel-mode authentication” option is checked (4); after changing this option, perform an IIS reset before resuming.


Enabling Kernel Mode Authentication in IIS Manager

Next, we need to run appcmd and set the useAppPoolCredentials attribute to true for our web application (SharePoint - 80). You need to run the cmd console in administrator mode if your server has User Account Control enabled. The appcmd tool is located in the C:\Windows\System32\inetsrv folder.

Now, execute a command:

Appcmd set config "SharePoint - 80" /section:windowsauthentication /useAppPoolCredentials:true /commit:MACHINE/WEBROOT/APPHOST


CMD console with appcmd command

Now we need to check that the application host configuration is properly set up in order to continue with the Kerberos authentication setup. Open C:\Windows\System32\inetsrv\config\applicationHost.config and check whether our application (SharePoint - 80) has the proper attributes set in the system.webServer section.

My entire SharePoint – 80 entry in the applicationHost.config file is below:

<location path="SharePoint - 80">
  <system.webServer>
    <handlers accessPolicy="Read, Execute, Script" />
    <security>
      <authentication>
        <windowsAuthentication enabled="true" useKernelMode="true" useAppPoolCredentials="true">
          <providers>
            <clear />
            <add value="NTLM" />
          </providers>
          <extendedProtection tokenChecking="None" />
        </windowsAuthentication>
        <anonymousAuthentication enabled="false" />
        <digestAuthentication enabled="false" />
        <basicAuthentication enabled="false" />
      </authentication>
    </security>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    <httpErrors existingResponse="PassThrough" />
    <httpProtocol>
      <customHeaders>
        <clear />
        <add name="X-Powered-By" value="ASP.NET" />
        <add name="MicrosoftSharePointTeamServices" value="14.0.0.4762" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</location>



Please note that the useKernelMode and useAppPoolCredentials attributes shown above are the ones we’ve just set and are required for Kerberos authentication to work properly.

Now run the IISReset /noforce command to reload the changes on the web server. We have only one step left in the back-end configuration of Kerberos – we need to set an SPN, which is required to map the service and host name to our custom application pool account.

On the Web-Front End server open command prompt with administrative privileges, and execute the command:

Setspn -A HTTP/SiteURL domain\application_pool_account

It is very important to use the host name from the application’s URL and the domain account that is the identity of the application pool of the site. If you are unsure what the application pool identity is, go to IIS Manager, select the Application Pools section in the left pane, and read the account that is running your application pool (SharePoint - 80 in this example).


Application Pools view in IIS Manager

As you can see in our example, the SharePoint - 80 application pool is using the account chaos\spsadmin, so the command in my environment will be:

Setspn -A HTTP/sps2010 chaos\spsadmin
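
To double-check that the SPN was actually registered, you can list the SPNs currently set on the account (the account name below matches the example above):

Setspn -L chaos\spsadmin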

Now, we should enable the trust for delegation for this account. To do this, go to the Domain Controller and launch Active Directory Users and Computers console, then locate the account (in our example it is chaosspsadmin account) and in the properties of the account, select the Delegation tab and then select “Trust this user for delegation to any service (Kerberos Only)” option.

Note that you won’t see the Delegation tab if you have missed a step or made a mistake when running the setspn command for the application pool identity.

Now the last Kerberos step – we need to enable Kerberos on the Web Application itself. To do this, launch Central Administration, select Application Management – Manage Web Applications, and mark our web application (SharePoint – 80). You should now see in the ribbon the Authentication Providers icon – click on it.


Central Administration – Authentication Providers icon in the ribbon

Select the correct zone for your web application where we’ll be enabling Kerberos authentication (by default it is Default zone) and in the IIS Authentication settings change the radio button from NTLM to Negotiate (Kerberos).


Authentication for the application changed from NTLM to Kerberos

We’ve spent quite some time configuring Kerberos, but believe me – it is worth the effort, especially in larger environments, where you’ll probably need to tweak performance in the first place.

Application Pool Recycling

There’s not much to configure here, but a lot to explain. It is very important to tweak application pool recycling to suit your farm infrastructure and server architecture. It is best to recycle the pools at night, when your sites have the lowest user traffic. If you have multiple load-balanced servers, it is strongly recommended to remove the server being recycled from the load balancer first, or you’ll experience poor performance during the process. Since SharePoint Server 2010 requires a 64-bit environment, you can forget about maximum memory-based limits, as these are managed by the IIS server itself.


Application Pool recycling settings

Checked Out Pages

If your sites are using Enterprise Content Management and check-in/check-out functionality, you should never leave pages checked out, because this visibly decreases page rendering performance for users. Instead, check them in as quickly as possible to avoid slower performance.

Now we have looked at most of the front-end SharePoint performance settings. In Part 3 we will look at some of the back-end performance tuning.


Posted in TUTORIALS

Windows 2008 Server Role Servers Explained

Posted by Alin D on October 7, 2010

A server on a network – standalone or member – can function in a number of roles. As the needs of your computing environment change, you may want to change the role of a server. By using Server Manager and the Add Roles Wizard, you can install Active Directory Domain Services to promote a member server to a domain controller, or you can install individual roles or combinations of various roles, such as DHCP, WINS, and DNS.

It is also relatively straightforward to demote a domain controller to a simple role server or remove any number of roles and features from a server.

Server Manager is the key configuration console you will use for installing server roles and features on your server. It can be configured to open automatically as soon as you log in to the Windows console or desktop.
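
If you prefer the command line, Windows Server 2008 also includes ServerManagerCmd.exe, which can query and install roles and features without the GUI. The commands below are a quick sketch; role and feature identifiers vary, so list them first with -query:

servermanagercmd -query
servermanagercmd -install DNS
servermanagercmd -install DHCP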

Types of roles

Let’s look at the various roles and features you can install on Windows Server 2008.

Active Directory Certificate Services (AD CS)
AD CS role services install on a number of operating systems, including Windows Server 2008, Windows Server 2003, and Windows 2000 Server. Naturally the fullest implementation of AD CS is only possible on Windows Server 2008. You can deploy AD CS as a single standalone certification authority (CA), or you can deploy multiple servers and configure them as root, policy, and certificate issuing authorities. You also have a variety of Online Responder configuration possibilities.

Active Directory Domain Services (AD DS)
This is the role in the Windows Server 2008 operating system that stores information about users, computers, and other resources on a network. AD DS is also used for directory-enabled applications such as Microsoft Exchange Server.

Active Directory Federation Services (AD FS)
AD FS employs technology that allows users, over the life of a single online session, to securely share digital identity and entitlement rights, or “claims”, across security and enterprise boundaries. This role – introduced and supported on all operating systems since Microsoft Windows Server 2003 R2 – provides Web Single Sign-On (SSO) services to allow a user to access multiple, related Web applications.

Active Directory Lightweight Directory Services (AD LDS)
This service is ideal if you are required to support directory-enabled applications. AD LDS is a Lightweight Directory Access Protocol (LDAP) compliant directory service.

Active Directory Rights Management Services (AD RMS)
This service augments an organization’s security strategy by protecting information through persistent usage policies. The key to the service is that the right management policies are bound to the information no matter where it resides or to where it is moved. AD RMS is used to lock down documents, spreadsheets, e-mail, and so on from being infiltrated or ending up in the wrong hands. AD RMS, for example, prevents e-mails from being accidentally forwarded to the wrong people.

The Application Server role
This role supports the deployment and operation of custom business applications that are built with Microsoft .NET Framework. The Application Server role lets you choose services for applications that require COM+, Message Queuing, Web services, and Distributed Coordinated Transactions.

DHCP and DNS
These two roles install two critical network services required on every network. They support Active Directory integration and IPv6. WINS is not classified as a key role for Windows Server 2008; you install it as a feature, discussed later.

Fax Server role
The fax server lets you set up a service to send and receive faxes over your network. The role creates a fax server and installs the Fax Service Manager and the Fax service on the server.

File Server role
This role lets you set up all the bits, bells, and whistles that come with a Windows file server. This role also lets you install Share and Storage Management, the Distributed File System (DFS), the File Server Resource Manager application for managing file servers, Services for Network File System (NFS), Windows File Services, which include stuff like the File Replication Service (FRS), and so on.

Network Policy and Access Services
This provides the following network connectivity solutions: Network Access Protection (NAP), the client health policy creation, enforcement, and remediation technology; secure wireless and wired access (802.1X), wireless access points, remote access solutions, virtual private network (VPN) services, Radius, and more.

Print Management role
The print services provide a single interface that you use to manage multiple printers and print servers on your network.

Terminal Services role
This service provides technologies that enable users to access Windows-based programs that are installed on a terminal server. Users can execute applications remotely (they still run on the remote server) or they can access the full Windows desktop on the target server.

Universal Description, Discovery, and Integration (UDDI)
UDDI Services provide capabilities for sharing information about Web services. UDDI is used on the intranet, between entities participating on an extranet, or on the Internet.

Web Server role
This role provides IIS 7.0, the Web server, ASP.NET, and the Windows Communication Foundation (WCF).

Windows Deployment Services
These services are used for deployment of new computers in medium to large organizations.

Features

Server Manager also lets you install dozens of features on Windows Server 2008. These so-called features are actually programs or supporting layers that support or augment the functionality of one or more roles, or simply add to the functionality of the server. A good example of a feature is the clustering service. Now called Failover Clustering, this feature can be used to support mission-critical roles such as File Services, Printer Services, and DHCP Server, on server clusters. This provides for higher availability and performance.

Other features you will likely install include SMTP Server, Telnet Client and Server, Group Policy Management (for use with Active Directory), Remote Assistance, and more.

Posted in Windows 2008

How to install IIS7 on Windows 2008 Core

Posted by Alin D on August 20, 2010

The full IIS installation installs all available feature packages for Server Core. If there are feature packages you do not need, you should edit the script to install only the packages you require. The default IIS installation installs a minimal set of available feature packages.

If you want to install IIS 7 components that rely on the .NET Framework, you must first install the .NET Framework. The components that rely on the .NET Framework will not be installed if the .NET Framework is not already installed.

To use a script to install the .NET Framework and the full IIS 7.5 installation on Server Core, type the following command into a script:
CMD /C START /w PKGMGR.EXE /l:log.etw /iu:IIS-WebServerRole;IIS-WebServer;IIS-CommonHttpFeatures;IIS-StaticContent;IIS-DefaultDocument;IIS-DirectoryBrowsing;IIS-HttpErrors;IIS-HttpRedirect;IIS-ApplicationDevelopment;IIS-ASP;IIS-CGI;IIS-ISAPIExtensions;IIS-ISAPIFilter;IIS-ServerSideIncludes;IIS-HealthAndDiagnostics;IIS-HttpLogging;IIS-LoggingLibraries;IIS-RequestMonitor;IIS-HttpTracing;IIS-CustomLogging;IIS-ODBCLogging;IIS-Security;IIS-BasicAuthentication;IIS-WindowsAuthentication;IIS-DigestAuthentication;IIS-ClientCertificateMappingAuthentication;IIS-IISCertificateMappingAuthentication;IIS-URLAuthorization;IIS-RequestFiltering;IIS-IPSecurity;IIS-Performance;IIS-HttpCompressionStatic;IIS-HttpCompressionDynamic;IIS-WebServerManagementTools;IIS-ManagementScriptingTools;IIS-IIS6ManagementCompatibility;IIS-Metabase;IIS-WMICompatibility;IIS-LegacyScripts;WAS-WindowsActivationService;WAS-ProcessModel;IIS-FTPServer;IIS-FTPSvc;IIS-FTPExtensibility;IIS-WebDAV;IIS-ASPNET;IIS-NetFxExtensibility;WAS-NetFxEnvironment;WAS-ConfigurationAPI;IIS-ManagementService;MicrosoftWindowsPowerShell;NetFx2-ServerCore;NetFx2-ServerCore-WOW64
To use a script for the full IIS 7.5 installation on Server Core, type the following command into a script:
CMD /C START /w PKGMGR.EXE /l:log.etw /iu:IIS-WebServerRole;IIS-WebServer;IIS-CommonHttpFeatures;IIS-StaticContent;IIS-DefaultDocument;IIS-DirectoryBrowsing;IIS-HttpErrors;IIS-HttpRedirect;IIS-ApplicationDevelopment;IIS-ASP;IIS-CGI;IIS-ISAPIExtensions;IIS-ISAPIFilter;IIS-ServerSideIncludes;IIS-HealthAndDiagnostics;IIS-HttpLogging;IIS-LoggingLibraries;IIS-RequestMonitor;IIS-HttpTracing;IIS-CustomLogging;IIS-ODBCLogging;IIS-Security;IIS-BasicAuthentication;IIS-WindowsAuthentication;IIS-DigestAuthentication;IIS-ClientCertificateMappingAuthentication;IIS-IISCertificateMappingAuthentication;IIS-URLAuthorization;IIS-RequestFiltering;IIS-IPSecurity;IIS-Performance;IIS-HttpCompressionStatic;IIS-HttpCompressionDynamic;IIS-WebServerManagementTools;IIS-ManagementScriptingTools;IIS-IIS6ManagementCompatibility;IIS-Metabase;IIS-WMICompatibility;IIS-LegacyScripts;WAS-WindowsActivationService;WAS-ProcessModel;IIS-FTPServer;IIS-FTPSvc;IIS-FTPExtensibility;IIS-WebDAV;IIS-ASPNET;IIS-NetFxExtensibility;WAS-NetFxEnvironment;WAS-ConfigurationAPI;IIS-ManagementService;MicrosoftWindowsPowerShell
To use a script for the default installation on Server Core, type the following command into a script:
start /w pkgmgr /l:log.etw /iu:IIS-WebServerRole;WAS-WindowsActivationService;WAS-ProcessModel;WAS-NetFxEnvironment;WAS-ConfigurationAPI
Install Roles and Services

1. Use the command oclist to list the available and installed roles and services on the server. The oclist command also renders component dependencies.

For example, the oclist output shows that IIS-FTPExtensibility is dependent on IIS-FTPSvc. To install IIS-FTPExtensibility, it is first necessary to install IIS-FTPSvc.

2. Use the ocsetup command to install and uninstall individual roles and services.
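
For example, to install the FTP components mentioned above (respecting the dependency order), you could run:

start /w ocsetup IIS-FTPSvc
start /w ocsetup IIS-FTPExtensibility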

3. Next, run oclist | more to verify which IIS components have been installed.
Install the .NET Framework

If you plan to use ASP.NET or IIS Remote Management then it is necessary to install .NET Framework first. To install it use the following commands:
start /w ocsetup NetFx2-ServerCore
start /w ocsetup NetFx2-ServerCore-WOW64

Install ASP.NET

1. Install ASP.NET by running the following commands (in order):
start /w ocsetup WAS-NetFxEnvironment
start /w ocsetup IIS-ISAPIExtensions
start /w ocsetup IIS-ISAPIFilter
start /w ocsetup IIS-NetFxExtensibility
start /w ocsetup IIS-ASPNET

Install Windows PowerShell and IIS Snap-In

1. Install Windows PowerShell by running the following command:
start /w ocsetup MicrosoftWindowsPowerShell
2. Next, start Windows PowerShell with the following command:
\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
You should see a PowerShell prompt.

3. In order to enable the IIS snap-in, you must change the script execution policy by running this command:
Set-ExecutionPolicy RemoteSigned
4. Restart PowerShell for the policy changes to take effect. After restarting PowerShell, import the IIS snap-in:
import-module WebAdministration
5. You can obtain the list of available IIS cmdlets by typing:
Get-Command -Module WebAdministration
Enable IIS Remote Management

Because Windows Server 2008 R2 Server Core does not have a graphical user interface (GUI), the command prompt must be used for administrative tasks. It may be more convenient to manage Server Core from another computer using IIS remote management.

The IIS Manager for Remote Administration:

* Remotely manages IIS 7 from Windows® 7, Windows Vista®, Windows® XP, and Windows Server® 2003.
* Connects directly to a Web server, Web site, or Web application.
* Installs even when IIS 7 is not installed on the local computer.
* Allows multiple simultaneous connections.
* Supports delegated administration to Web sites and Web applications, so owners can connect to and manage their own site directly.
* Is a familiar and easy-to-use administration tool.
* Supports HTTP over Secure Sockets Layer (SSL) for more secure management.
* Automatically downloads features to the local IIS Manager for Remote Administration console to match features newly installed on the remote Web server.

1. By default, Remote Desktop is not enabled on the Server Core. Install the IIS remote management service by using the following command:
start /w ocsetup IIS-ManagementService
2. Enable remote management with the following command:
reg add HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\WebManagementServer ^
/v EnableRemoteManagement /t REG_DWORD /d 1

3. Start the management service by typing:
net start wmsvc
4. Connect to the IIS on the Server Core from a remote machine by using IIS Manager for Remote Administration.

To uninstall the Web Server (IIS) role, use the following command:
start /w pkgmgr /uu:IIS-WebServerRole;WAS-WindowsActivationService;WAS-ProcessModel

Posted in Scripting

Windows Azure Storage

Posted by Alin D on August 17, 2010

Windows Azure Overview

Before I begin to build the application, a quick overview of Windows Azure and Roles is necessary. There are many resources available that describe these, so I won’t go into a lot of detail here.

Windows Azure is Microsoft’s Cloud Computing offering that serves as the development, service host, and service management environment for the Windows Azure Platform. The Platform is comprised of three pieces: Windows Azure, SQL Azure, and AppFabric.

  • Windows Azure: Cloud-based Operating System which provides a virtualized hosting environment, computing resources, and storage.
  • SQL Azure: Cloud-based relational database management system that includes reporting and analytics.
  • AppFabric: Service bus and access control for connecting distributed applications, including both on-premise and cloud applications.

Windows Azure Roles

Unlike security related roles that most developers may be familiar with, Windows Azure Roles are used to provision and configure the virtual environment for the application when it is deployed. The figure below shows the Roles currently available in Visual Studio 2010.

[Image: Roles available in Visual Studio 2010]

Except for the CGI Web Role, these should be self-explanatory. The CGI Web Role is used to provide an environment for running non-ASP.NET web applications such as PHP. This provides a means for customers to move existing applications to the cloud without the cost and time associated with rewriting them in .NET.

Building the Azure application

The first step is, of course, to create the Windows Azure application to use for this demonstration. After the prerequisites have been installed and configured, you can open Visual Studio and take the normal path to create a new project. In the New Project dialog, expand the Visual C# tree if it isn't already expanded, and click Cloud. You will see one template available, Windows Azure Cloud Service. Note that although .NET Framework 4 is selected, Windows Azure does not yet support 4.0, so the projects will default to .NET Framework 3.5.

[Image: New Project dialog]

After selecting this template, the New Cloud Service Project dialog will be displayed, listing the available Windows Azure Roles. For this application, select an ASP.NET Web Role and a Worker Role. After the roles have been added to the Cloud Service Solution list, you can rename them by hovering over the role to display the edit link. You can, of course, add additional Roles after the solution has been created.

[Image: New Cloud Service Project dialog]

After the solution has been created, you will see three projects in the Solution Explorer.

[Image: Solution Explorer showing the three projects]

As this article is about Azure Storage rather than Windows Azure itself, I’ll briefly cover some of the settings but leave more in-depth coverage for other articles or resources.

Under the Roles folder, you can see two items, one for each of the roles that were added in the previous step. Whether you double-click the item or right-click and select Properties from the context menu, the Properties page for the given role opens. The image below is for the AzureStorageWeb Role.

[Image: Web Role Properties page]

The first section on the Configuration tab is for selecting the trust level of the application. These settings should be familiar to most .NET developers. The Instances section tells the Windows Azure Platform how many instances of this role to create and what size of Virtual Machine to provision. If this Web Role were for a high-volume web application, selecting a higher number of instances would improve its availability. Windows Azure handles the load balancing for all of the instances that are created. The VM sizes are as follows:

  • Small: 1 core processor, 1.75 GB RAM, 250 GB hard drive
  • Medium: 2 core processor, 3.5 GB RAM, 500 GB hard drive
  • Large: 4 core processor, 7 GB RAM, 1000 GB hard drive
  • Extra large: 8 core processor, 15 GB RAM, 2000 GB hard drive

The Startup action is specific to Web Roles and, as you can see, allows you to designate whether the application is accessed via HTTP or HTTPS.

Settings

The Settings tab should be familiar to .NET developers and is where any additional settings for the application can be created. Any settings added here will be placed in the ServiceConfiguration and ServiceDefinition files, since they apply to the service itself rather than to a specific role project. Of course, the projects also have the web.config and app.config files that are specific to them.
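
A setting added on this tab can then be read at run time through the RoleEnvironment class. This is a minimal sketch; the setting name WelcomeMessage is hypothetical and only illustrates the call.

// Reads a custom setting defined on the Settings tab; "WelcomeMessage"
// is a hypothetical setting name used only for illustration
string message = RoleEnvironment.GetConfigurationSettingValue("WelcomeMessage");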

EndPoints

The EndPoints tab allows you to configure the endpoints that will be exposed for the Role. In this case, the Web Role can be configured for HTTP or HTTPS with a specific port and an SSL certificate if appropriate.

[Image: Worker Role EndPoints tab]

As you can see here, the Worker Role has a different Endpoints screen. The types available from the dropdown are Input and Internal, and the Protocol dropdown includes http, https, and tcp. This allows you to connect to the Worker Role via any of these protocols and expose its functionality externally if necessary.

Web Role

Since this article is meant to focus on Azure Storage, I'll keep the UI simple. However, thanks to jQuery and some styles, a simple interface can still look good with little effort.

[Image: The application UI]

There is nothing special about the web application; it is just like any other web app you have built. There is one class that is unique to Azure, however: the WebRole class. All Roles in Windows Azure must have a class that derives from RoleEntryPoint. This class is used by Windows Azure to initialize and control the application. The default implementation provides an override for the OnStart method and assigns a handler for the RoleEnvironmentChanging event. This allows the Role to be restarted if the configuration changes, such as increasing the instance count or adding a new setting. If other actions need to be taken before the application starts, they should be handled here. Likewise, the Run and OnStop methods can be overridden to perform actions while the role is running and before it is stopped, respectively.

public override bool OnStart()
{
    DiagnosticMonitor.Start("DiagnosticsConnectionString");

    // For information on handling configuration changes
    // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
    RoleEnvironment.Changing += RoleEnvironmentChanging;

    return base.OnStart();
}

private void RoleEnvironmentChanging(object sender, RoleEnvironmentChangingEventArgs e)
{
    // If a configuration setting is changing
    if(e.Changes.Any(change => change is RoleEnvironmentConfigurationSettingChange))
    {
        // Set e.Cancel to true to restart this role instance
        e.Cancel = true;
    }
}

Azure Storage

As I’ve said, there are three types of storage available with the Windows Azure Platform: blob, table, and queue.

Blob Storage

Binary Large Object, or blob, should be familiar to most developers and is used to store things like images, documents, or videos; something larger than a name or ID. Blob storage is organized by containers that can have two types of blob: Block and Page. The type of blob needed depends on its usage and size. Block blobs are limited to 200 GB, while Page blobs can go up to 1 TB. Note, however, that in development, storage blobs are limited to 2 GB. Blob storage can be accessed via RESTful methods with a URL such as: http://myapp.blob.core.windows.net/container_name/blob_name.

Although blob storage isn't hierarchical, a hierarchy can be simulated through naming. Blob names can contain /, so you can have names such as 2009/10/4/photo1, 2009/10/4/photo2, and 2008/6/25/photo1. Here it appears that the blobs are organized by year, month, and day; in reality, however, these are simply flat blob names within the container.
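
To give a sense of how this simulated hierarchy can be used, here is a small sketch that lists the blobs under one of these prefixes. It assumes the same Client property and CONTAINER_NAME constant that appear later in the blob storage class; the prefix itself is just an example.

// List all blobs whose names start with the virtual "directory" 2009/10;
// the blobs are still stored with flat names such as 2009/10/4/photo1
CloudBlobDirectory directory = Client.GetContainerReference(CONTAINER_NAME)
    .GetDirectoryReference("2009/10");

foreach(IListBlobItem item in directory.ListBlobs())
{
    Console.WriteLine(item.Uri);
}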

Block Blob

Although a Block blob can be up to 200 GB, if it is larger than 64 MB, it must be sent in multiple chunks of no more than 4 MB. Storing a Block blob is also a two-step process; the block must be committed before it becomes available. When a Block blob is sent in multiple chunks, they can be sent in any order. The order in which the Commit call is made determines how the blob is assembled. Thankfully, as we’ll see later, the Azure Storage API hides these details so you won’t have to worry about them unless you want to.
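
If you do want to manage the blocks yourself, the outline looks roughly like the sketch below. It assumes the Client property and CONTAINER_NAME constant shown later in this article; the 4 MB chunk size and the Base64-encoded block IDs follow the rules described above, and committing the block list is what finally makes the blob visible.

public void PutLargeBlob(Stream sourceStream, string blobName)
{
    CloudBlockBlob blob = Client.GetContainerReference(CONTAINER_NAME)
        .GetBlockBlobReference(blobName);

    List<string> blockIds = new List<string>();
    byte[] buffer = new byte[4 * 1024 * 1024]; // 4 MB chunks
    int blockNumber = 0;
    int bytesRead;

    while((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block IDs must be Base64 encoded and the same length within a blob
        string blockId = Convert.ToBase64String(
            Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));

        blob.PutBlock(blockId, new MemoryStream(buffer, 0, bytesRead), null);
        blockIds.Add(blockId);
        blockNumber++;
    }

    // The blob only becomes available once the block list is committed
    blob.PutBlockList(blockIds);
}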

Page Blob

A Page blob can be up to 1 TB in size and is organized into 512-byte pages within the blob. This means any point in the blob can be accessed for read or write operations by using the offset from the start of the blob. This is the advantage of using a Page blob rather than a Block blob, which can only be accessed as a whole.
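
A minimal sketch of working with a Page blob, again assuming the Client property and CONTAINER_NAME constant from later in the article, might look like this; the blob name is made up, and the blob size, the write offset, and every write must be aligned to 512-byte pages.

public void WritePage(byte[] data, long offset)
{
    CloudPageBlob pageBlob = Client.GetContainerReference(CONTAINER_NAME)
        .GetPageBlobReference("random-access.dat");

    // The total size must be a multiple of 512 bytes
    pageBlob.Create(1024 * 1024); // 1 MB Page blob

    // Pad the data out to a full 512 byte page
    byte[] page = new byte[512];
    Array.Copy(data, page, Math.Min(data.Length, page.Length));

    // offset must also fall on a 512 byte boundary
    pageBlob.WritePages(new MemoryStream(page), offset);
}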

Table Storage

Azure tables are not like tables from an RDBMS such as SQL Server. They are composed of a collection of entities, and each entity contains a collection of properties expressed as name, type, and value. The thing to realize, and what may cause a problem for some developers, is that Azure tables can't be accessed using ADO.NET methods. As with all other Azure storage methods, RESTful access is provided: http://myapp.table.core.windows.net/TableName.

I’ll cover tables in-depth later when getting to the actual code.

Queue Storage

Queues are used to transport messages between applications, Azure-based or not. Think of Microsoft Message Queuing (MSMQ) for the cloud. As with the other storage types, RESTful access is available as well: http://myapp.queue.core.windows.net/QueueName.

Queue messages can only be up to 8 KB; remember, a queue isn't meant to transport large objects, only messages. However, a message can be a URI to a blob or table entity. Where Azure Queues differ from traditional queue implementations is that a queue is not a strict FIFO container, and a message remains in the queue until it is explicitly deleted. If a message is read by one process, it is marked as invisible to other processes for a configurable time period, which defaults to 30 seconds and can be no more than 2 hours; if the message hasn't been deleted by then, it is returned to the queue and becomes available for processing again. Because of this behavior, there is also no guarantee that messages will be delivered in any particular order.
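
The practical consequence is a read, process, delete pattern: a consumer gets a message, does its work, and only then deletes the message, so a crash mid-processing simply lets the message reappear. A rough sketch, assuming the same Client property and QUEUE_NAME constant used later in the queue storage class, and a hypothetical ProcessMessage method:

CloudQueue queue = Client.GetQueueReference(QUEUE_NAME);
queue.CreateIfNotExist();

// The message becomes invisible to other readers for the visibility timeout
CloudQueueMessage msg = queue.GetMessage();
if(msg != null)
{
    ProcessMessage(msg.AsString); // hypothetical processing step

    // Delete only after successful processing; if this line is never
    // reached, the message reappears on the queue and can be retried
    queue.DeleteMessage(msg);
}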

Building the Storage Methods

To start with, I'll add another project to the solution, a Class Library project. This project will serve as a container for the storage methods and implementation used in this solution. After creating the project, you'll need to add a reference to the Windows Azure Storage assembly, Microsoft.WindowsAzure.StorageClient.dll, which can be found in the Windows Azure SDK folder, C:\Program Files\Windows Azure SDK\v1.1\ref.

StorageBase

Since a CloudStorageAccount is necessary for any access, I’ll create a base class to contain a property for it.

public static CloudStorageAccount Account
{
    get
    {
        // For development this can be used
        //return CloudStorageAccount.DevelopmentStorageAccount;
        // or this so code doesn't need to be changed before deployment
        return CloudStorageAccount.FromConfigurationSetting("DiagnosticsConnectionString");
    }
}

You’ll see here that we can use two methods to return the CloudStorageAccount object. Since the application is being run in a development environment, we could use the first method and return the static property DevelopmentStorageAccount. However, before deployment, this would need to be updated to an actual account. Using the second method, however, the account information can be retrieved from the configuration file, similar to database connection strings in an app.config or web.config file. Before the FromConfigurationSetting method can be used though, we must add some code to the OnStart method of the WebRole class.

// This code is necessary to use CloudStorageAccount.FromConfigurationSetting
CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
{
    configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    RoleEnvironment.Changed += (sender, arg) =>
    {
        if(arg.Changes.OfType<RoleEnvironmentConfigurationSettingChange>()
            .Any((change) => (change.ConfigurationSettingName == configName)))
        {
            if(!configSetter(RoleEnvironment.GetConfigurationSettingValue(configName)))
            {
                RoleEnvironment.RequestRecycle();
            }
        }
    };
});

This code basically tells the runtime to use the configuration file for setting information, and also sets an event handler for the RoleEnvironment.Changed event to detect any changes to the configuration file. If a change is detected, the Role will be restarted so those changes can take effect. This code also makes the default RoleEnvironment.Changing event handler implementation unnecessary since they both do the same thing, restarting the role when a configuration change is made.

Implementing Blob Storage

The first thing we need is a reference to a CloudBlobClient object to access the methods. As you can see, there are two ways to do this. Both produce the same result; the constructor gives more control over the creation, while CreateCloudBlobClient is simply less typing.

public static CloudBlobClient Client
{
    get
    {
        //return new CloudBlobClient(Account.BlobEndpoint.AbsoluteUri,
        //                           Account.Credentials);

        // More direct method
        return Account.CreateCloudBlobClient();
    }
}

Uploading the blob is a relatively easy task.

public void PutBlobBlock(Stream stream, string fileName)
{
    // This method returns true if the container did not exist and was created
    // but for this purpose it doesn't matter.
    Client.GetContainerReference(CONTAINER_NAME).CreateIfNotExist();

    // Now that the container has been created if necessary
    // we can upload the blob
    Client.GetContainerReference(CONTAINER_NAME)
        .GetBlobReference(fileName)
        .UploadFromStream(stream);
}

As you can see, the first step is to retrieve a reference to the container. The CreateIfNotExist method is a convenience that, as the name implies, will create the container if it doesn’t already exist. An alternative approach would be as follows:

// GetContainerReference never returns null; check for the container's
// existence by fetching its attributes and create it if that fails
CloudBlobContainer container = Client.GetContainerReference(CONTAINER_NAME);
try
{
    container.FetchAttributes();
}
catch(StorageClientException)
{
    container.Create();
}

After you have a reference to the container, the next step is to get a reference to the blob. If a blob already exists with the specified name, it will be overwritten. After obtaining a reference to the CloudBlob object, it's just a matter of calling the appropriate method to upload the blob. In this case, I'll use the UploadFromStream method since the file is coming from the ASP.NET upload control as a stream; however, there are other methods depending on the environment and usage, such as UploadFile, which uses the path of a physical file. All of the upload and download methods also have asynchronous counterparts.
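
For instance, when the content is already on disk, a sketch using the file-based methods could look like the following; the blob name and local paths are made up for illustration.

CloudBlob blob = Client.GetContainerReference(CONTAINER_NAME)
    .GetBlobReference("report.xlsx");

// Upload straight from a file on disk and pull it back down again
blob.UploadFile(@"C:\temp\report.xlsx");
blob.DownloadToFile(@"C:\temp\report-copy.xlsx");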

One thing to note here is that container names must be lowercase. If you try a name with capitalization, you will receive a rather cryptic and uninformative StorageClientException with the message "One of the request inputs is out of range." Further, the InnerException will be a WebException with the message "The remote server returned an error: (400) Bad Request."

Implementing Table Storage

Of the three storage types, Azure Table Storage requires the most setup. The first thing necessary is to create a model for the data that will be stored in the table.

public class MetaData : TableServiceEntity
{
    public MetaData()
    {
        PartitionKey = "MetaData";
        RowKey = "Not Set";
    }

    public string Description { get; set; }
    public DateTime Date { get; set; }
    public string ImageURL { get; set; }
}

For this demonstration, the model is very simple, but, most importantly, it derives from TableServiceEntity which tells Azure the class represents a table entity. Although Azure Table Storage is not a relational database, there must be some mechanism to uniquely identify the rows that are stored in a table. The PartitionKey and RowKey properties from the TableServiceEntity class are used for this purpose. The PartitionKey itself is used to partition the table data across multiple storage nodes in the virtual environment, and, although an application can use one partition for all table data, it may not be the best solution for scalability and performance.
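
As an illustration of an alternative, hypothetical partitioning scheme (not the one this article uses), entities could be grouped by year so that the data spreads across multiple partitions:

// Hypothetical scheme: partition by year, use a unique RowKey per entity
MetaData entry = new MetaData
{
    PartitionKey = "2010",
    RowKey = Guid.NewGuid().ToString(),
    Description = "Partitioned by year"
};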

Windows Azure Table Storage is based on WCF Data Services (formerly ADO.NET Data Services), so there needs to be some context for the table. The TableServiceContext class represents this, so I'll derive a class from it.

public class MetaDataContext : TableServiceContext
{
    private const string ENTITY_NAME = "MetaData";

    public MetaDataContext(string baseAddress, StorageCredentials credentials)
        : base(baseAddress, credentials)
    {
        CloudTableClient.CreateTablesFromModel(typeof(MetaDataContext),
                                               baseAddress, credentials);
    }
}

Within the constructor, I’ll make sure the table has also been constructed, so it will be available when necessary. This could, of course, also be done in the RoleEntryPoint OnStart method if the table may be used in multiple classes.
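
If you prefer the role-startup approach, a sketch of that variant might look like the following; it assumes SetConfigurationSettingPublisher has already been called earlier in OnStart, as shown in the WebRole code above.

public override bool OnStart()
{
    // ... SetConfigurationSettingPublisher call from earlier goes here ...

    // Ensure the MetaData table exists before any request needs it
    CloudStorageAccount account =
        CloudStorageAccount.FromConfigurationSetting("DiagnosticsConnectionString");

    CloudTableClient.CreateTablesFromModel(typeof(MetaDataContext),
        account.TableEndpoint.AbsoluteUri, account.Credentials);

    return base.OnStart();
}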

public void Add(MetaData data)
{
    // RowKey can't have / so replace it
    data.RowKey = data.RowKey.Replace("/", "_");
    AddObject(ENTITY_NAME, data);
    SaveChanges();
}

Adding to the table should be very familiar to anyone who has worked with LINQ to SQL or Entity Framework. You add the object to the data context, then save all the changes. Note here the RowKey naming. Since I’m using the date for the filename, I need to make a slight modification since RowKey can’t contain “/” characters.

public IQueryable<MetaData> MetaData
{
    get { return CreateQuery<MetaData>(ENTITY_NAME); }
}

Getting to the contents of the table is a matter of creating a DataServiceQuery for the model and specifying the EntitySet you are interested in. From there, you can use LINQ to access a particular item.

public MetaData GetMetaData(string key)
{
    return (from e in Context.MetaData
            where e.RowKey == key && e.PartitionKey == "MetaData"
            select e).SingleOrDefault();
}

Implementing Queue Storage

Queue storage is probably the easiest part to implement. Unlike Table storage, there is no need to set up a model and context, and unlike Blob storage, there is no need to be concerned with blocks and pages. Queue storage is only meant to store small messages of 8 KB or less. Adding a message to a Queue follows the same pattern as the other storage mechanisms: first get a reference to the Queue, creating it if necessary, then add the message.

public void Add(CloudQueueMessage msg)
{
    Client.GetQueueReference(QUEUE_NAME).CreateIfNotExist();

    Client.GetQueueReference(QUEUE_NAME).AddMessage(msg);
}

Retrieving a message from the Queue is just as easy. First, make sure the Queue exists (GetQueueReference never returns null, so the code below simply creates the Queue if necessary), then check that a message is waiting before attempting to retrieve it.

public CloudQueueMessage GetNextMessage()
{
    // GetQueueReference never returns null, so make sure the queue exists
    // and then check whether a message is waiting before retrieving it
    CloudQueue queue = Client.GetQueueReference(QUEUE_NAME);
    queue.CreateIfNotExist();

    if(queue.PeekMessage() != null)
    {
        return queue.GetMessage();
    }

    return null;
}

Worker Role

Now we can finally get to the Worker Role. To demonstrate how a Worker Role can be incorporated into a project, I’ll use it to add a watermark to the images that have been uploaded. The Queue that was previously created will be used to notify this Worker Role when it needs to process an image and which one to process.

Just as with the Web Role, the OnStart method is used to set up and configure the environment. Worker Roles have an additional method, Run, which simply creates a loop and continues indefinitely. It's somewhat odd not to have an exit condition; instead, when the role is stopped, the loop is forcibly terminated, which may cause issues for any code running in it.

public override void Run()
{
    // This is a sample worker implementation. Replace with your logic.
    Trace.WriteLine("AzureStorageWorker entry point called", "Information");

    while(true)
    {
        PhotoProcessing.Run();

        Thread.Sleep(10000);
        Trace.WriteLine("Working", "Information");
    }
}
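
If the abrupt termination mentioned above is a concern, one option is to guard the loop with a flag that OnStop clears, so the current iteration can finish. A minimal sketch:

private volatile bool keepRunning = true;

public override void Run()
{
    while(keepRunning)
    {
        PhotoProcessing.Run();
        Thread.Sleep(10000);
    }
}

public override void OnStop()
{
    // Let the loop exit after its current iteration rather than
    // being terminated in the middle of processing
    keepRunning = false;
    base.OnStop();
}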

You can view the sample code for this article to see the details of PhotoProcessing.Run. It simply gets the blob indicated in the QueueMessage, adds a watermark, and updates the Blob storage.
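
The actual implementation is in the sample download, but a simplified sketch of what PhotoProcessing.Run might look like is shown below; the "photos" container name and the watermark text are hypothetical, and a real version should also delete the queue message once processing succeeds.

public static class PhotoProcessing
{
    private const string CONTAINER_NAME = "photos"; // hypothetical container name

    public static void Run()
    {
        Storage.Queue queueStorage = new Storage.Queue();
        CloudQueueMessage msg = queueStorage.GetNextMessage();
        if(msg == null)
            return;

        // The message was written as blobURI + "$" + fileName in OnUpload
        string fileName = msg.AsString.Split('$')[1];

        CloudBlob blob = Storage.Blob.Client
            .GetContainerReference(CONTAINER_NAME)
            .GetBlobReference(fileName);

        using(MemoryStream original = new MemoryStream())
        {
            // Download the original image
            blob.DownloadToStream(original);
            original.Position = 0;

            using(Image image = Image.FromStream(original))
            using(Graphics g = Graphics.FromImage(image))
            {
                // Stamp a simple text watermark onto the image
                g.DrawString("Sample watermark", new Font("Arial", 24),
                             Brushes.White, 10, 10);

                using(MemoryStream updated = new MemoryStream())
                {
                    image.Save(updated, ImageFormat.Jpeg);
                    updated.Position = 0;

                    // Overwrite the blob with the watermarked version
                    blob.UploadFromStream(updated);
                }
            }
        }
    }
}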

Putting it all Together

Now that everything has been implemented, it’s just a matter of putting it all together. Using the Click event for the Upload button on the ASPX page, I’ll get the file that is being uploaded and the other pertinent details. The first step is to upload the blob so we can get the URI that points to it and add it to Table storage along with the description and date. The final step is adding a message to the Queue to trigger the worker process.

protected void OnUpload(object sender, EventArgs e)
{
    if(FileUpload.HasFile)
    {
        DateTime dt = DateTime.Parse(Date.Text);
        string fileName = string.Format("{0}_{1}",
           dt.ToString("yyyy/MM/dd"), FileUpload.FileName);

        // Upload the blob
        Storage.Blob blobStorage = new Storage.Blob();
        string blobURI =
          blobStorage.PutBlob(FileUpload.PostedFile.InputStream, fileName);


        // Add entry to table
        Storage.Table tableStorage = new Storage.Table();

        tableStorage.Add(new Storage.MetaData
            {
                Description = Description.Text,
                Date = dt,
                ImageURL = blobURI,
                RowKey = fileName
            }
        );

        // Add message to queue
        Storage.Queue queueStorage = new Storage.Queue();
        queueStorage.Add(new CloudQueueMessage(blobURI + "$" + fileName));

        // Reset fields
        Description.Text = "";
        Date.Text = "";
    }
}

As I said, the UI is very simple, with the focus being on the underlying processes for Azure Storage.

Conclusion

Hopefully, this article has given you an overview of what Windows Azure Storage is and how it can be used. There is, of course, much more that can be covered on this topic, some of which may be addressed in follow-up articles, along with pointers to resources that provide additional information and insight about Windows Azure and Windows Azure Storage.

Points of Interest

Support for uppercase characters in the names of Tables, Blob containers, and Queues appears to be inconsistent. The safest approach is to always use lowercase names.
