Thursday, April 24, 2014

Reading ASP.NET web.config file (ConnectionString and AppSettings) from classic ASP (.asp)

Classic Microsoft ASP is still out there and being used. On most of the classic projects I've been involved in, the classic website is in a state of migration to a new ASP.NET website, usually in a hybrid form. Moving the project forward requires unifying the configuration so that the two applications can co-exist and run as one application.

To make that easier I wrote a simple classic ASP class that can read connection strings and app settings from a .NET web.config file.

The class is designed to support multiple web.config files and, in the case of connection strings, a configSource reference. This allows you to externalize your connection strings from your web.config files.


Usage:


The class is initialized the same way as any other object in classic ASP:
Dim config 
Set config = new Configuration
It should be disposed of like any classic ASP object:
Set config = Nothing
By default the class is initialized with a configuration file named "web.config". You can change it by setting the ConfigurationFile property:
Dim config 
Set config = new Configuration

config.ConfigurationFile = "web.config"
Reading an app setting:
Dim config 
Set config = new Configuration

Response.Write config.AppSetting("appLevel")

Set config = Nothing
Reading a connection string:
Dim config 
Set config = new Configuration

Response.Write config.ConnectionString("YourConnectionStringName")

Set config = Nothing
Note: This works whether the connection strings are embedded in the configuration file or referenced through a configSource file.
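For reference, here is what the two layouts look like. The file names, keys, and connection string values below are illustrative, not part of the class itself:

```xml
<!-- web.config: appSettings embedded, connectionStrings externalized -->
<configuration>
  <appSettings>
    <add key="appLevel" value="QA" />
  </appSettings>
  <!-- the element must be empty when configSource is used -->
  <connectionStrings configSource="connectionStrings.config" />
</configuration>

<!-- connectionStrings.config: the externalized file the class will follow -->
<connectionStrings>
  <add name="YourConnectionStringName"
       connectionString="Data Source=MyServer; Initial Catalog=MyDb; Integrated Security=SSPI"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The class reads the configSource attribute off the connectionStrings node, maps the referenced path with Server.MapPath, and queries the connectionStrings node of that second file.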


Code:

Class Configuration
    'class sub
    Private Sub Class_Initialize 
        m_ConfigurationFile = "web.config" 'set default configuration file (for web applications)
    End Sub

    'configuration file
    Private m_ConfigurationFile
    Public Property Get ConfigurationFile()
        ConfigurationFile = m_ConfigurationFile
    End Property
    Public Property Let ConfigurationFile(p_ConfigurationFile)
         m_ConfigurationFile = p_ConfigurationFile
    End Property

    Public Property Get ConfigurationFileFullPath()
        ConfigurationFileFullPath = Server.MapPath(m_ConfigurationFile)
    End Property

    ''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
    ' Business/Feature Functions
    ''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
    Public Function AppSetting(p_AppSettingName)
        Dim objXMLDoc
        Set objXMLDoc = LoadConfigurationXmlFile() 'load the configuration file

        Dim n_AppSettings
        Set n_AppSettings = objXMLDoc.selectSingleNode("configuration/appSettings/add[@key='" + p_AppSettingName + "']")

        If IsNull(n_AppSettings) Or (varType(n_AppSettings) = vbEmpty) Or (IsObject(n_AppSettings) = False) Or (n_AppSettings is Nothing) Then
            Call Err.Raise(15002, "AppSetting", "AppSetting " + p_AppSettingName + " does not exist")
        Else
            AppSetting = n_AppSettings.GetAttribute("value")
        End If

        Set n_AppSettings = Nothing
        Call DestroyXmlObject(objXMLDoc)
    End Function

    'ConnectionString
    'Read a connection string from a web.config (or a referenced .config file)
    Public Function ConnectionString(p_ConnectionStringName)
        Dim objXMLDoc
        Set objXMLDoc = LoadConfigurationXmlFile() 'load the configuration file
        
        Dim n_ConnectionString
        Dim s_attributeConfigSource
                
        Set n_ConnectionString = objXMLDoc.selectSingleNode("configuration/connectionStrings")
        s_attributeConfigSource = n_ConnectionString.GetAttribute("configSource")
            
        If IsNull(s_attributeConfigSource) Or s_attributeConfigSource = "" Then
            ConnectionString = ReadFromConnectionStringNode(n_ConnectionString, p_ConnectionStringName) 
        Else
            'response.Write(s_attributeConfigSource)
            Dim objXMLConnectionStringDoc

            Set objXMLConnectionStringDoc = CreateXmlObject()
            objXMLConnectionStringDoc.load Server.MapPath(s_attributeConfigSource) 'load the configuration file

            Set n_ConnectionString = objXMLConnectionStringDoc.selectSingleNode("connectionStrings") 'set the connection string node to read from this file.

            ConnectionString = ReadFromConnectionStringNode(n_ConnectionString, p_ConnectionStringName) 

            Call DestroyXmlObject(objXMLConnectionStringDoc)
        End If

        s_attributeConfigSource = Empty 'string value, not an object - no Set needed
        Set n_ConnectionString = Nothing

        Call DestroyXmlObject(objXMLDoc)
    End Function

    'ReadFromConnectionStringNode
    'Query out connection string information from the connection string node. It will contain
    'zero to many connection strings so we will find the correct one by looking for the connection
    'string name using an XPath statement.
    'Errors:
    '15001 - connection string not found
    Private Function ReadFromConnectionStringNode(n_ConnectionString, p_ConnectionStringName)
        'query for the connection string information
        Dim n_configInfo
        Set n_configInfo = n_ConnectionString.selectSingleNode("add[@name='" + p_ConnectionStringName + "']")               

        If IsNull(n_configInfo) Or (varType(n_configInfo) = vbEmpty) Or (IsObject(n_configInfo) = False) Or (n_configInfo is Nothing) Then
            Call Err.Raise(15001, "ConnectionString", "Connection String " + p_ConnectionStringName + " does not exist")
        End If

        ReadFromConnectionStringNode = n_configInfo.GetAttribute("connectionString") 'return the connection string information from the xml file

        Set n_configInfo = Nothing
    End Function

    ''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
    ' Utility Functions
    ''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
    Private Function LoadConfigurationXmlFile()
        Dim objXMLDoc
        Set objXMLDoc = CreateXmlObject()   
        objXMLDoc.load Me.ConfigurationFileFullPath() 'load the configuration file

        Set LoadConfigurationXmlFile = objXMLDoc
    End Function

    Private Function CreateXmlObject()
        Dim objXMLDoc
        Set objXMLDoc = Server.CreateObject("MSXML2.DOMDocument.3.0")    
        objXMLDoc.async = False    

        Set CreateXmlObject = objXMLDoc
    End Function

    'Releases the caller's reference. Invoke with Call (or without parentheses)
    'so the argument stays ByRef; "DestroyXmlObject(x)" as a bare statement
    'would pass a ByVal copy and leave the caller's reference alive.
    Private Sub DestroyXmlObject(p_XmlObject)
        Set p_XmlObject = Nothing
    End Sub
End Class
Key Words:
Classic asp, web.config, classic asp web.config, classic asp read connection string from web.config, classic asp read appsettings from web.config

References:
http://en.wikipedia.org/wiki/Active_Server_Pages
http://en.wikipedia.org/wiki/Asp.net

Wednesday, August 8, 2012

The current build operation (build key Build Key[Microsoft.Practices.EnterpriseLibrary.Data.Database, null]) failed: The value can not be null or an empty string.


This error occurs when using the Microsoft EnterpriseLibrary Data Application block. Specifically, when attempting to create a database using the database factory.
Example:
Database db = DatabaseFactory.CreateDatabase();

The Error Message: The current build operation (build key Build Key[Microsoft.Practices.EnterpriseLibrary.Data.Database, null]) failed: The value can not be null or an empty string.

The Resolution:
The EnterpriseLibrary Data Application Block requires connection strings to be configured. In this case, because no connection string name was specified, the DatabaseFactory attempts to use the default database.

In order to use a default database, you need to add the dataConfiguration element to your application configuration file.

The dataConfiguration element contains a single attribute, defaultDatabase, which names the connection string to use as the default.
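With that element in place, both factory calls work. A quick sketch (the connection string name "MyDatabase" is just the one used in the sample config in this post):

```csharp
// Uses the default database named by the dataConfiguration element
Database db = DatabaseFactory.CreateDatabase();

// Or bypass the default and name the connection string explicitly
Database namedDb = DatabaseFactory.CreateDatabase("MyDatabase");
```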

app.config file containing dataConfiguration element
Resources: Database Factory Class on MSDN



This is a simple example of an app.config file that contains the necessary elements:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <configSections>
    <section name="dataConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Data.Configuration.DatabaseSettings, Microsoft.Practices.EnterpriseLibrary.Data" />
  </configSections>
  <connectionStrings>
    <add name="MyDatabase" connectionString="Initial Catalog=MyApplicationDatabase; Data Source=MyApplicationDatabaseServer; User ID=MyUserName; Password=MyPassword" providerName="System.Data.SqlClient"/>
  </connectionStrings>
  <dataConfiguration defaultDatabase="MyDatabase"/>
</configuration>

Wednesday, February 8, 2012

IE 8 Developer Tools Window Not Displaying

When using IE 8, you try to open the Developer Tools (F12 or via Tools -> Developer Tools). Either way, the Developer Tools show up on the Task Bar, but the window is not displayed. Opening, closing, and restarting fixes nothing.

It actually appears that the Developer Tools window is moving off the active screen (and into a minimized state):

So what is actually happening:
The developer tools are opening (just not on the active screen).

In order to bring it up:

  • Select 'Developer Tools' as your active window (Hotkey: Alt+Tab)
  • Press Alt+Space to bring up the system menu
  • Select "Move" and then re-position the Developer Tools window on the active screen
  • Resize the window

Monday, January 30, 2012

ILMerge: Getting Started, Merging, and Alternatives (to ILMerge).

What is ILMerge?
ILMerge is a Microsoft-produced assembly merge utility. It merges multiple .NET assemblies (or executables) into a single assembly.

ILMerge is a command-line utility.

Installation:
To begin, you will need to install ILMerge on your machine (or the machine where you plan to merge assemblies). You can download ILMerge directly from the Microsoft Download Center at: http://www.microsoft.com/download/en/confirmation.aspx?id=17630

The installation itself is a Microsoft Installer (MSI) file named ILMerge.msi. Run the file and follow the fairly typical installation path: Welcome -> License Agreement (EULA) -> Install Folder -> Confirm

Note: You will want to remember the installation folder, since you will have to run ILMerge from this location (or, at a minimum, add this location to the Windows PATH). My install location was C:\Program Files (x86)\Microsoft\ILMerge

Best Practices:

  1. Script the merge process
  2. Incorporate the assembly merge into your build process

Examples:
Merge:
ilmerge /out:Merged.dll Primary.dll Secondary1.dll Secondary2.dll 
Merge with Wildcard:
ilmerge /wildcards /out:Merged.dll Primary.dll Secondary*.dll 
Merge with Log:
ilmerge /log /out:Merged.dll Primary.dll Secondary1.dll Secondary2.dll
ilmerge /log:log.txt /out:Merged.dll Primary.dll Secondary1.dll Secondary2.dll

IL Merge with .NET 4.0:
... when using ILMerge with .NET 4.0 assemblies you have to set the target platform to .NET 4.0 (command line parameter: /targetplatform:v4).
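For example (the framework directory below is the typical v4.0 location; it may differ on your machine):

```
ilmerge /targetplatform:v4,"C:\Windows\Microsoft.NET\Framework\v4.0.30319" /out:Merged.dll Primary.dll Secondary1.dll
```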

... Microsoft Research (via Mike Barnett) provides an example configuration file for using ILMerge with .NET 4.0 at: http://research.microsoft.com/en-us/people/mbarnett/ilmerge-40-exe-config.aspx

This is the example:
<?xml version ="1.0"?>
<configuration>
<startup useLegacyV2RuntimeActivationPolicy="true">
<requiredRuntime safemode="true" imageVersion="v4.0.30319" version="v4.0.30319"/>
</startup>
</configuration>
Source: http://research.microsoft.com/en-us/people/mbarnett/ilmerge-40-exe-config.aspx 

Are there alternatives to ILMerge (that will accomplish the same relative goal)?


Assemblies as Resources
Jeffrey Richter on the Microsoft Press blog proposed adding assemblies as resources, and loading them via reflection. You can read more about this approach here: http://blogs.msdn.com/b/microsoft_press/archive/2010/02/03/jeffrey-richter-excerpt-2-from-clr-via-c-third-edition.aspx

Using this approach, you "hook" into the assembly resolve for the AppDomain, and then load the assemblies via reflection.
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) => {
   String resourceName = "AssemblyLoadingAndReflection." + new AssemblyName(args.Name).Name + ".dll";

   using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceName)) {
      Byte[] assemblyData = new Byte[stream.Length];
      stream.Read(assemblyData, 0, assemblyData.Length);
      return Assembly.Load(assemblyData);
   }
};
Source: http://blogs.msdn.com/b/microsoft_press/archive/2010/02/03/jeffrey-richter-excerpt-2-from-clr-via-c-third-edition.aspx

This blog post (Combining multiple assemblies into a single EXE for a WPF application) contains a more comprehensive approach, and includes a way to incorporate this strategy into your build process via an AfterResolveReferences target:

<Target Name="AfterResolveReferences">
  <ItemGroup>
    <EmbeddedResource Include="@(ReferenceCopyLocalPaths)" Condition="'%(ReferenceCopyLocalPaths.Extension)' == '.dll'">
      <LogicalName>%(ReferenceCopyLocalPaths.DestinationSubDirectory)%(ReferenceCopyLocalPaths.Filename)%(ReferenceCopyLocalPaths.Extension)</LogicalName>
    </EmbeddedResource>
  </ItemGroup>
</Target>

RedGate Smart Assembly
Smart Assembly from Red Gate's description is "an obfuscator that helps protect your .NET code against reverse-engineering, cracking, and modification".

Smart Assembly also has the ability to combine multiple assemblies into a single one.  They describe this feature as "Simplify the deployment of your application by packaging it in one file (Dependency Embedding with compression and encryption, and Dependency Merging)."

Notes:
... ILMerge cannot merge WPF assemblies (per a couple of sources). According to Mike Barnett, the reason is: "[WPF assemblies] contain resources with encoded assembly identities. ILMerge is unable to deserialize the resources, modify the assembly identities, and then re-serialize them." See Assemblies as Resources above for an alternative.


...  ILMerge cannot merge C++ assemblies containing native code


... Make sure you have accounted for all assembly references before attempting to merge, or you will get an exception:
Missing Reference Assembly Screenshot

An exception occurred during merging:
Unresolved assembly reference not allowed: System.Data.Entity.
   at System.Compiler.Ir2md.GetAssemblyRefIndex(AssemblyNode assembly)
   at System.Compiler.Ir2md.GetTypeRefIndex(TypeNode type)
   at System.Compiler.Ir2md.VisitReferencedType(TypeNode type)
   at System.Compiler.Ir2md.VisitClass(Class Class)
   at System.Compiler.Ir2md.VisitModule(Module module)
   at System.Compiler.Ir2md.SetupMetadataWriter(String debugSymbolsLocation)
   at System.Compiler.Ir2md.WritePE(Module module, String debugSymbolsLocation, BinaryWriter writer)
   at System.Compiler.Writer.WritePE(String location, Boolean writeDebugSymbols, Module module, Boolean delaySign, String keyFileName, String keyName)
   at System.Compiler.Writer.WritePE(CompilerParameters compilerParameters, Module module)
   at ILMerging.ILMerge.Merge()
   at ILMerging.ILMerge.Main(String[] args)

... A few comments say that merging removes XML documentation comments from the assembly.


Resources:
ILMerge Download - The Microsoft download center web page that provides the download link to install ILMerge. The actual installation is ILMerge.msi.
Mike Barnett ILMerge
ILMerge config file for executing within the CLR v4.0 runtime
Jeffrey Richter: Excerpt #2 from CLR via C#, Third Edition
Combining multiple assemblies into a single EXE for a WPF application
RedGate Smart Assembly
Tools & Utilities

Wednesday, January 25, 2012

log4Net: RollingFileAppender Class

What is log4net?
Apache log4net from their website: "The Apache log4net library is a tool to help the programmer output log statements to a variety of output targets. log4net is a port of the excellent Apache log4j™ framework to the Microsoft® .NET runtime."

What is the RollingFileAppender?
RollingFileAppender lives in the log4net.Appender namespace. According to the website, it is designed to append to "log files based on size or date or both." A pretty basic description, but it does describe what this appender does: it appends log messages to an existing log file (or creates a new one if none exists), and when the roll condition is reached it creates a new file.

The conditions under which a roll occurs are configured via the RollingStyle property. You can set the log to roll based upon date, file size, or a combination of the two (known as Composite in the documentation).

Configuration for the RollingFileAppender:
Configuration for the RollingFileAppender is primarily done in an XML configuration file or in your application configuration file.

The good part of a configuration-based approach is the ability to customize your logging for each life-cycle environment (think QA logging versus production logging), or for an individual customer or client's needs.

Log4net also supports run-time (in-code) configuration as well. If you're interested in this type of configuration, you can get started here: http://logging.apache.org/log4net/release/manual/configuration.html


In this example, the RollingFileAppender is configured to write to a file named log.txt, roll on size, and cap the maximum file size. One important thing to note is the staticLogFileName setting in the example below. This causes log4net to append a count number to the end of the log file as it creates new ones (Example: log.txt.1, log.txt.2, log.txt.3)


<appender name="RollingFileAppender" type="log4net.Appender.RollingFileAppender">
    <file value="log.txt" />
    <appendToFile value="true" />
    <rollingStyle value="Size" />
    <maxSizeRollBackups value="10" />
    <maximumFileSize value="100KB" />
    <staticLogFileName value="true" />
    <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
    </layout>
</appender>

   


This example is a little different: its rolling is determined by the Composite style (date and size together). Note the datePattern, used to control the format of the date in file names:

<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
    <file value="logfile" />
    <appendToFile value="true" />
    <rollingStyle value="Composite" />
    <datePattern value="yyyyMMdd" />
    <maxSizeRollBackups value="10" />
    <maximumFileSize value="1MB" />
    <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
    </layout>
</appender>


This represents a typical rolling date configuration section for log4net (thanks to dommer):

<log4net>
    <appender name="RollingFile" type="log4net.Appender.RollingFileAppender">
        <file value="c:\Logs\Today.log"/>
        <rollingStyle value="Date"/>
        <datePattern value="yyyyMMdd"/>
        <appendToFile value="true"/>
        <layout type="log4net.Layout.PatternLayout">
            <conversionPattern value="%level %logger %date{ISO8601} - %message%newline"/>
        </layout>
    </appender>
    <root>
        <!-- Options are "ALL", "DEBUG", "INFO", "WARN", "ERROR", "FATAL" and "OFF". -->
        <level value="ERROR"/>
        <appender-ref ref="RollingFile"/>
    </root>
</log4net>
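A configuration like the ones above is typically consumed from code along these lines. This is a sketch; the class and logger names are illustrative, and the attribute assumes the settings live in your application config file:

```csharp
// In AssemblyInfo.cs: load log4net settings from the app config file
// and watch it for changes
[assembly: log4net.Config.XmlConfigurator(Watch = true)]

public class OrderService
{
    // One static logger per type is the common convention
    private static readonly log4net.ILog Log =
        log4net.LogManager.GetLogger(typeof(OrderService));

    public void Process()
    {
        Log.Info("Processing started");    // below the root ERROR level, not written
        Log.Error("Something imperfect");  // passes the threshold, goes to the rolling file
    }
}
```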

Consider your larger logging strategy with RollingFileAppender
Every application should generate log entries. The world is an imperfect place, and the log, at a minimum, should tell you about those imperfections. However, depending on your logging "enthusiasm" and logging level, you are going to write a lot of messages. Even a small application can generate a tremendous amount of logging data.

RollingFileAppender can help with that by:

  1. Naturally segmenting your data into smaller files (making a potential search later on easier)
  2. If you use the Date rolling style, your files will be separated by day
  3. RollingFileAppender can also manage the destruction of older log files via maxSizeRollBackups, which lets you set the number of archive files to keep online.
If you combine these features with a strategy to compress and store files (or destroy them with maxSizeRollBackups), you can have a self-managing and effective strategy for tracking messages.


Notes:
... RollingFileAppender's full name is log4net.Appender.RollingFileAppender

... Build this into a larger logging and archiving strategy.

... Is log4net's RollingFileAppender any better than other logging libraries' rolling file appenders? The jury is still out.

Resources:
log4net
RollingFileAppender Class
log4net Manual Configuration
log4net Tutorial

Tuesday, January 24, 2012

Adding CDATA - (Unparsed) Character Data to XElement

When working with the System.Xml.Linq namespace and creating an XDocument, you may need to create document data that should not be parsed by the XML parser. For example: data with XML reserved characters, XML content, HTML content (or really anything you do not want to be parsed).

Creating data within a CDATA section is simple: wrap the content within the XElement in an additional object called XCData.
XDocument contactsDoc =
   new XDocument(
      new XDeclaration("1.0", "utf-8", "yes"),
      new XComment("LINQ to XML Contacts XML Example"),
      new XProcessingInstruction("MyApp", "123-44-4444"),
      new XElement("contacts",
         new XElement("contact",
            new XElement("name", "Patrick Hines"), 
            new XElement("comment", new XCData("ALERT! Do not call!")),
            new XElement("address",
               new XElement("street1", "123 Main St"),
               new XElement("city", "Mercer Island"),
               new XElement("state", "WA"),
               new XElement("postal", "68042")
            )
         )
      )
   );
Sample adapted from .NET Language-Integrated Query for XML Data 
References:
CDATA
.NET Language-Integrated Query for XML Data
XCData Class

Monday, January 23, 2012

Check -> Argument -> Condition: .NET 4.0 Code Contract, CuttingEdge.Conditions, Shrinkr Check


An early part of designing an effective, maintainable system is consistency. The more consistent your application's responses, the better it will be in the long term. One place that breeds inconsistency is parameter validation and exception throwing.

So I set out to find a condition checking library capable of accomplishing a few simple goals:

  • Must provide a simple interface for validating conditions (object and state)
  • Must have a consistent response algorithm or structure
  • Must be written in .NET 
  • Must be open-source

With the additional side-goals (or bonus points):

  • A fluent interface
  • Documentation or code samples
  • NuGet package
There are plenty of libraries and source-code examples out there that accomplish these goals:

  • .NET 4.0 Code Contracts - Code Contracts provide a language-agnostic way to express coding assumptions in .NET programs. The Common Language Runtime (CLR) team is introducing a library to allow programming with contracts in the Microsoft .NET Framework 4.
  • Shrinkr's Check -  "a Url Shortening Service which demonstrates some of the best practices in developing real life web applications." and contains a fluent checking class structure.
  • CuttingEdge.Conditions - CuttingEdge.Conditions is a library that helps developers to write pre- and postcondition validations in their C# 3.0 and VB.NET 9.0 code base. 
  • GuardQ - validating function arguments.
  • Fluent Validation - "A small validation library for .NET that uses a fluent interface and lambda expressions for building validation rules for your business objects."
  • FluentValidation for .NET 2.0 - Rework of the popular FluentValidation library to work in .NET 2.0. There is one minor syntax change and the LinqBridge (lambda syntax for 2.0) library is required, so please read the docs. Namespaces and licenses are similar to FluentValidation.
Here are a few:
CuttingEdge.Conditions

CuttingEdge.Conditions Validation Extensions
CuttingEdge.Conditions is an open-source project (available on CodePlex) that "helps developers to write pre- and postcondition validations". The library supports a wide variety of conditions (including those great string-based checks like IsNotNullOrEmpty). On the right is a screenshot of the condition validation extensions if you want a quick look.

The library provides a consistent way to manage condition checking, and even provides a way to pass a formatted string for the exception message:
Condition.Requires(context, "id").IsNotNull("Parameter {0} cannot be a null reference");

Another feature this library provides is WithExceptionOnFailure (AlternativeExceptionCondition WithExceptionOnFailure&lt;TException&gt;() where TException : Exception), allowing you to define the exception to be raised when the condition fails. While this is not something you would generally want to do, the added flexibility is certainly appreciated.
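A quick sketch of that usage, assuming you want an InvalidOperationException raised instead of the library's default argument exceptions:

```csharp
// Fails with InvalidOperationException (not ArgumentNullException)
// when context is null
Condition.WithExceptionOnFailure<InvalidOperationException>()
    .Requires(context, "context")
    .IsNotNull();
```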

.NET 4.0 Code Contracts

.NET Code Contracts is part of the 4.0 version of the .NET framework. From MSDN: "Code contracts provide a way to specify preconditions, postconditions, and object invariants in your code. Preconditions are requirements that must be met when entering a method or property. Postconditions describe expectations at the time the method or property code exits. Object invariants describe the expected state for a class that is in a good state."

It seems intuitive, and easy.

The downside? It requires your application to run on .NET 4.0. If you've already upgraded, this seems like a great choice. I mean, it's built into the Framework (no external dependencies ++), and generally offers a comparable feature set to the open-source implementations designed to provide this functionality.

Compared side by side with Conditions, the syntax and code appear very similar:

            Contract -> Contract.Requires(context != null, "Parameter context cannot be a null reference");
            Condition-> Condition.Requires(context, "context").IsNotNull("Parameter {0} cannot be a null reference");


It's slightly more flexible in its implementation than Conditions, most notably because Conditions ships a default set of validations and exceptions, while Contract is wide open.

As an additional bonus, Code Contracts offer legacy requires statements. From the documentation:

Most code already contains some parameter validation in the form of if-then-throw code. The contract tools recognize if-then-throw statements as preconditions when the statements appear first inside a method, and the entire set of such statements is followed by an explicit Contract method call, such as a Requires, Ensures, EnsuresOnThrow, or EndContractBlock.
When if-then-throw statements appear in this form, the contract tools recognize them as legacy-require statements. The EndContractBlock form is used only if no other contracts follow the if-then-throw sequences, but they should still be tagged as legacy-requires.
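In other words, existing guard clauses can be promoted to contracts without rewriting them. A minimal sketch (the Save method and Customer type are illustrative):

```csharp
public void Save(Customer customer)
{
    // Legacy if-then-throw validation...
    if (customer == null)
        throw new ArgumentNullException("customer");

    // ...recognized as a precondition once the block is terminated
    // by an explicit Contract call
    Contract.EndContractBlock();

    // method body follows
}
```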
Shrinkr's Check.Argument
Shrinkr is "a Url Shortening Service which demonstrates some of the best practices in developing real life web applications." and contains a fluent checking class structure. It's simple and clean, and provides a way to validate arguments with predictable exception handling.

Here's an example of the Check class. I have also seen a version of this (from another location) where the Check and Argument classes are not static:
public static class Check
{
    public static class Argument
    {
        public static void IsNotNull(object parameter, 
                                     string parameterName)
        { ... }

        public static void IsNotNullOrEmpty(string parameter, 
                                            string parameterName)
        { ... }

        // ... etc ...
    }
}
This provides the basic functionality that Conditions provides, but with a far simpler implementation. It's also part of a larger project rather than a stand-alone library. However, it does meet the goals above, and would be an easy addition to any enterprise's library.

End-Notes:
... I still like Conditions the best, but only slightly. Mostly it's the little things, like formatted messages (so I can extract them to a resource) or the validations already built and ready to go.

... There are plenty of options to choose from. Pick one and start using it. Almost all of these will produce better, more consistent results than rolling your own. If your code does not validate parameters, add one of these and get started.

... This is an easy upgrade to your library that will produce good results in the short, middle and long term.

Resources:
Shrinkr
CuttingEdge.Conditions
CuttingEdge.Conditions (NuGet)
NuGet
Fluent Validation
.NET 4.0 Code Contracts on MSDN Magazine
DevLabs Code Contracts
Fluent Validation
FluentValidation for .NET 2.0
GuardQ

Friday, January 20, 2012

Amazon DynamoDB: First Impressions

An e-mail arrived this morning from Amazon Web Services (AWS):

It was a new feature announcement from AWS.

The new product is DynamoDB... a new NoSQL database engine for Amazon Web Services.

From the e-mail, it's described as "Amazon DynamoDB, a fully managed NoSQL database service that provides fast and predictable performance with seamless scalability".

It has some interesting key points:
  • Data on Solid State Drives (SSDs)
  • Replicated data synchronously across multiple zones.
  • Supports two types of primary keys: Hash Type and Hash and Range Type 
  • JSON for data transport
This new product is a NoSQL database engine built into their cloud infrastructure (similar to SimpleDB as a product, but with some notable changes and upgrades).

From CTO Werner Vogels' DynamoDB blog post he states the differences between SimpleDB and DynamoDB: "While SimpleDB has been successful and powers the applications of many customers, it has some limitations that customers have consistently asked us to address." Listing: Domain scaling limitations, Predictability of Performance, and SimpleDB's Pricing complexity.


From the documentation, it appears to be similar to their existing SimpleDB offering, but designed for large-scale needs and predictable performance. Most notably: no limits or restrictions on database size, and automatic management of scalability across multiple servers.


What is NoSQL? 
According to Wikipedia's  NoSQL article:
In computing, NoSQL (sometimes expanded to "not only SQL") is a broad class of database management systems that differ from the classic model of the relational database management system (RDBMS) in some significant ways, the most important being that they do not use SQL as their query language.
DynamoDB Storage Model (1000 ft view)
From their documentation, each object (or item) is stored within a table, with the individual values within the item stored as attributes:
From Amazon DynamoDB (beta)
Getting Started
In order to get started with DynamoDB, you have to make sure you have a couple of things:
  • Amazon Web Services developer account
  • Your AWS account has to be set up to use Amazon DynamoDB
  • Your AWS Access and Secret Key (in order to run the examples, or start your development project)
DynamoDB and .NET
The AWS SDK for .NET has been updated to contain samples for using DynamoDB with the .NET framework (fair warning: if you have a previous version of the SDK, you may have to remove it before installing this one).

The SDK contains (basically) 4 samples:

  • Data Model Sample - creation of a context, persistence of items using the context, and some queries against the context.
  • Data Model Schema - Samples of objects that can be stored within DynamoDB. Of specific interest is the use of attributes: DynamoDBTable, which "Specifies that this object can be stored in DynamoDB", plus DynamoDBHashKey, DynamoDBProperty, and DynamoDBRangeKey, to name a few.
  • Document Model Sample - Demonstrates the document model. Shows creation of a table, and adding documents to a table, persistence, and querying from the model.
  • Table Operations - Basic table operations: Contains, CreateTable, table status, table deletes.
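To give a feel for the data-model attributes mentioned above, a class stored in DynamoDB might look roughly like this. The table name, keys, and properties are made up for illustration:

```csharp
[DynamoDBTable("Books")]
public class Book
{
    [DynamoDBHashKey]               // the table's hash key
    public string Id { get; set; }

    [DynamoDBRangeKey]              // range key (for hash-and-range tables)
    public string Title { get; set; }

    [DynamoDBProperty("pages")]     // map to a differently named attribute
    public int PageCount { get; set; }
}
```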

These samples are available for VS 2008 and VS 2010. With my installation, the samples were located at C:\Program Files (x86)\AWS SDK for .NET\Samples\AmazonDynamoDB_Sample.

Pricing (as of 1/19/2012)
Notes About Amazon DynamoDB states it pretty well "pricing is based on actual write/read operations and not API calls (e.g. a query returning 100 results accounts for 100 ops and not 1 op)".


You pay a flat, hourly rate based on the capacity you reserve:
Throughput Capacity
Write Throughput: $0.01 per hour for every 10 units of Write Capacity
Read Throughput: $0.01 per hour for every 50 units of Read Capacity

The listed pricing (as of 1/19/2012) offers a free-pricing tier as well:
Free Tier*
As part of AWS’s Free Usage Tier, AWS customers can get started with Amazon DynamoDB for free. DynamoDB customers get 100 MB of free storage, as well 5 writes/second and 10 reads/second of ongoing throughput capacity.
Final Thoughts:
... What's missing: From CTO Werner Vogels' DynamoDB blog post, one comment from Faraz points out a key component missing from DynamoDB's initial offering: "missing critical database pillar of snapshot backup and fast recovery system". Vogels' response: "we have a philosophy of launching with a minimal feature set and then quickly iterating while prioritizing based on customer feedback. Backup/Restore for DynamoDB will have high priority". While true, I'm still waiting for micro in VPC!

... "pure local emulation is not available". However, there is an open-source project that provides this emulation for SimpleDB: fakesdb (a fake version of Amazon's SimpleDB for local/integration testing).

... JSON is used for sending data and for responses, but it is not used as the native storage schema (from Notes About Amazon DynamoDB).

... DynamoDB vs. Cassandra