June 15, 2009

Rackspace support really is fanatical.

Filed under: General — Tags: — Darrin Maidlow @ 11:21 pm

Friday night I was working on configuring a customer’s server, and I’m pretty happy now that I recommended they go with Rackspace.  The server had some issues.  I had installed and configured Mapguide Open Source, SQL Server, and Image Web Server, and FloorView was up and running with all the data loaded in.

I was getting some strange IO errors in SQL Server.  DBCC CHECKDB was telling me “OMG this is bad”, and chkdsk on the drive was saying things like “Windows cannot complete the scan”.  I wandered off to get some more coffee, thinking this was going to be a fun night.  When I got back I saw a console message sent from a Rackspace support technician asking me if there was anything he could do to help.  I sent him back my phone number and asked him to give me a call.

After a bit of conversation about the server, we agreed they would strip out the guts of the machine and replace all the drives, controllers, and motherboard.  I went to bed, and the server was back up and ready for me the next morning.  So, 10:30pm Friday night, Rackspace support is proactively finding my problems and tracking me down.  I now understand what they mean when they call their support fanatical.

<3 Rackspace.

And I’m sure you’re sitting there thinking “can’t he find something better to do on a Friday night?” – well, now you know the answer to that question…

March 10, 2009

Using Automated Build Studio to automate offsite backups

Filed under: General — Tags: — Darrin Maidlow @ 10:33 pm

The importance of an offsite backup of your data is well known.  There are companies out there that do exactly that – offsite data storage.  Take, for example, Iron Mountain’s LiveVault service.  Depending on the amount of data you have to protect, LiveVault can be a great service.

I’ve been using Automated Build Studio (ABS) for some time now to ensure consistent and fast builds of our products, and I’m incredibly pleased with it.  One of the key benefits of ABS is its versatility.  Yes, the primary goal of the product is to automate the software build process – and a lot of specific tools are provided to accomplish this (e.g. working with source control, easy tools to automate compiling, etc.).  Build systems are really just a collection of tasks, usually very small tasks – but often there are dozens, hundreds or even thousands of them.  Tasks as small as running a DOS command, copying files, or making a zip file.  When I noticed that ABS gives you the ability to automate FTP operations, it got me thinking – this “build” tool could be used to automate so many of those little tasks I’ve always meant to “just write a small batch file” for, but never got around to doing…

One of our most critical pieces of data is the database for our source code version control.  No source code, no software company – I don’t think I need to say much more.  Vault uses SQL Server to store its bits.  Well, would you look at that – ABS has a “backup database” component.  The workflow of what I want to accomplish is quite simple:

  1. Backup and verify SQL Server databases
  2. Zip up said backup files
  3. Upload the zip file to an FTP server that cannot burn down or blow up at the same time as our database server.

 

Pretty simple – and ABS will do everything we need, and then some.  So let’s get started.  First, I have to assume you have a SQL Server instance running with databases in it.  I also assume you have an FTP server set up and running.  If you really want to get fancy, get an FTP server running that supports SSL.

ABS Installed on same machine as SQL Server

If ABS is installed on the same machine as SQL Server, we can make use of the built-in macro operations to back up SQL Server.  Start up ABS and create a new macro.  We’re going to start off simple – and maybe add something cool later.  First we’ll add a database backup operation.  From the left hand Operations “toolbox”, click on SQL to expand the operations available there.  Double click on “Back Up Database”.  This will add a database backup action to our macro.  On the right hand side of the screen in the “macro” section, double click on the new macro operation and fill in the information needed to connect to your SQL Server.  Be sure to specify the database, the destination path, and that the type of output is File.  If you have more than one database you want to back up, add another backup operation and repeat the process.  To back up Vault, we need to back up the sgvault and sgmaster databases.

If you run your macro at this point, you will be prompted with a message that “The executable file of the ‘SQLMaint’ tool is not specified”.  The SQLMaint executable is installed with your SQL Server instance and can be found in the Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Binn folder.  Set up your tools options to point to this file.  Note that MSSQL.1 will change depending on the SQL Server instance you are running.

But, SQL Server is installed on another machine and that’s the way I like it!

Yup, me too.  The built-in SQL Server backup operations depend on the SQLMaint executable to run the backup process.  Again, this file is installed with SQL Server, and I could not get it to run on a machine that did not have SQL Server installed.

Instead, what I did was create a batch file/SQL script combo on the machine running SQL Server and schedule a task to execute the batch file.  The SQL script looks like this:

    BACKUP DATABASE [sgmaster] TO DISK = N'C:\MSSQLData\backups\sgmaster.bak' WITH FORMAT, INIT, NAME = N'Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
    GO
    BACKUP DATABASE [sgvault] TO DISK = N'C:\MSSQLData\backups\sgvault.bak' WITH FORMAT, INIT, NAME = N'Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
    GO

 

A batch file that executes this script is scheduled to run at a fixed interval and simply consists of this:

   1: "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" -S localhost -i "C:\AdminScripts\SQL Server\backup.sql"

 

On execution, this batch file creates two files in C:\MSSQLData\backups.  These files will be processed by the next step in the macro.  Note that because no username or password information is being specified here, the user set to run this task must have the appropriate rights in SQL Server.  The batch file is executed by adding a Windows scheduled task.
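For what it’s worth, that scheduled task can also be created from the command line with schtasks rather than through the Task Scheduler UI.  A minimal sketch – the task name and batch file path are just placeholders, and note that older versions of schtasks expect the start time as HH:MM:SS while newer ones want HH:MM:

    rem Run the backup batch file every night at 02:00 under the local SYSTEM account.
    rem Task name and path are illustrative - point /tr at your own backup batch file.
    schtasks /create /tn "NightlySQLBackup" /tr "C:\AdminScripts\backup.bat" /sc daily /st 02:00:00 /ru SYSTEM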

Whichever path you chose for creating your backup file, go ahead and run the macro item for the SQL backup, or execute your batch file from the SQL Server machine, to get a backup made of the database(s).

Building the zip file

The next operation we need can be found in the “Archivers” section on the left.  I’m using Zip for this example.  Add a “Pack Files With zip” operation to the macro and set up its properties.  Define the archive name locally.  Next we’ll need to set the path to the files to zip.  We’ll use UNC paths to access a share on our SQL Server machine.  Depending on the rights of the user executing the macro, the administrative shares on the SQL Server can be used, or a new share can be created with the appropriate permissions assigned.

I set the zip to use the “Best” compression for the “Pack Level” and told it to “Move to archive”, which deletes the source files once they have been zipped.  I also set a password on the zip – but make sure you write that down somewhere =).

Select the new archive item and click the “Run the selected operation” button.  This should create a new zip file for you in the appropriate location.

Uploading the zip

In the toolbox, expand FTP and add a “Connect to FTP” operation.  Double click this operation in the macro section and set up the properties.  Add your hostname, user name, and password, and set the port if necessary.  If your FTP server is using SSL, set up the SSL information.  You can now click the “Run the selected operation” button to test the FTP connection.  Add “Create FTP Directory” and “Set Current FTP Directory” operations as needed.  Finally, we’ll upload the newly created zip.

Add a new “Upload File(s) to FTP” operation to your macro.  Browse to and select the newly created zip file.

Click the big green Go button – and your backup should be created (for a local SQL Server), a zip file made, and that zip file uploaded to your SSL-enabled FTP server.

Scheduling the macro run

ABS comes with a nice little interface to add scheduled executions.  From the “Tasks” menu choose “Windows Scheduler”.  This will bring up the task scheduler window.  Click Add to add a new task.  Up will come a helpful wizard that will walk you through setting up the task.  Set the various options like name, macro file, execution time and frequency, and you’re done.  If your SQL Server is on another machine and you have scheduled that backup to run, you will need to ensure that this task executes after the backup is complete – otherwise your backup could be incomplete.

Summary

Building this macro let us cancel our LiveVault account, saving the company a decent amount of cash per month.  By making use of a tool we already needed for our builds, plus our dedicated server, we were able to ensure that critical offsite backups are automatically taken care of.

This basic macro should help you get started with ABS to implement your own offsite backup solution – possibly making use of resources you already have in place.  Obviously this macro can be enhanced significantly – other resources can be zipped up and backed up.  ABS scripting and variables can be used to append a unique number or a date to the file name, which would allow a range of backups to be stored.
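To illustrate the idea, here is a hedged sketch of the date-stamping done in the SQL script itself rather than through ABS scripting – the path is reused from the example above, and the file name pattern is made up:

    -- Sketch: date-stamp the backup file name so each run is kept separately.
    DECLARE @stamp VARCHAR(8), @path VARCHAR(260)
    SET @stamp = CONVERT(VARCHAR(8), GETDATE(), 112)   -- yyyymmdd
    SET @path  = 'C:\MSSQLData\backups\sgvault_' + @stamp + '.bak'
    BACKUP DATABASE [sgvault] TO DISK = @path
      WITH FORMAT, INIT, NAME = N'Full Database Backup', SKIP, STATS = 10
    GO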

Enjoy!


February 18, 2009

Geospecialling v2.0 and an introduction to FullCircle

Filed under: General — Darrin Maidlow @ 1:11 pm

Hello!  Welcome back to Geospecialling, and welcome to the new site!  This new site has been my existence for the past several weeks – but more on that later.

It’s been a crazy couple of months (since Autodesk University), and we are getting closer to the end of the craziness.  Let me sum things up, and hopefully gain your forgiveness for the lack of posts for two months. =)  At AU we unveiled FullCircle, our newest product, and the reception has been phenomenal.  So after returning from AU, and getting all that feedback from partners and customers, we set to work finishing, testing, and polishing the first release of FullCircle – Standard Edition.

FullCircle has two main functions.  First, it allows users to log in and query data from their Oracle, SQL Server, and Access databases.  The tabular information is then sent down as an Excel spreadsheet where users may edit or append data.  This spreadsheet can then be uploaded to FullCircle and processed, and the updates/inserts are applied to the appropriate database table.

The second key function of FullCircle is for digital pen users.  FullCircle allows Excel forms to be defined and saved.  Users can then download the form templates from the repository, print the forms with Capturx, and fill them out.  When the digital pen is docked and the handwriting is processed back into Excel, these Excel files can then be processed with FullCircle.  The inserts (or updates if needed) are processed and put into the appropriate database table(s).  No temporary tables are used that require processing after the fact (unless, of course, that is what you want to do), and no custom coding.

On top of that we’ve been hard at work building this new site, setting up the new blogs, doing all that good stuff.  I can finally see the light at the end of the tunnel…

More soon!

December 4, 2008

Introduction to Topobase API

Filed under: General — Darrin Maidlow @ 9:34 pm

This morning I sat in on the Introduction to Topobase API class at Autodesk University presented by Dongjin Xing.  I’ve made an effort to attend Dongjin’s classes every year for the past several years.  He is a good presenter, and has a damn good handle on things.  If you are a developer type, I recommend you consider his courses.  This one was no exception.

I’ve heard people talking about Topobase for quite some time.  I’ve heard good, and I’ve heard bad.  Today I got my first look at the product.  Topobase is a server product that works with Oracle Spatial to facilitate the creation, editing and sharing of spatial data in Oracle.  A client is provided with a full API, and users can access data using AutoCAD Map or a web client based on MapGuide Enterprise.

The design of Topobase looks to be well tiered.  Using ADO.NET and OraDirect.net it also supports connection pooling.  They provide a neat VB.NET scripting interface that helps build simple workflows, and data validation rules.

All in all, everything they are putting forward looks quite cool – Visual Studio templates to build Topobase plugins that can be loaded right into the Topobase UI.  It could become a very cool starting point for developers/consultants to start building tools to work on the data.  But there are a few problems…

Topobase is by no means a new product.  It’s several revisions in since Autodesk acquired it – but it still seems to have some performance problems.  During the presentation this morning, Topobase took over a minute to load.  Once loaded, it took over a minute to load and render a small dataset.  Now, Oracle/Topobase and AutoCAD Map were running within a virtual machine, running on a notebook.  Even still – that’s slow.

Now, I’m no Autodesk insider.  I’m also not a business expert (but I have been learning a thing or two about it over the years).  History does tend to repeat itself.  I look back at GIS Design Server and Vision.  Like Topobase, both of these products were acquired by Autodesk.  Like Topobase, these products were ‘Enterprise Data Store’ type things.  Both are now "not actively promoted", and customers are encouraged to migrate to Topobase.  With the Autodesk stock price down, they are going to trim fat.  The Autodesk reseller channel for the most part does not have the skills to use, support, or sell this tool – and it really is a developer/consultant tool.  My fear is that if I were to invest the time in learning and developing for Topobase, Autodesk could "retire" the product.  Then that invested time is wasted, but more importantly the relationship with my customers is damaged by leaving them stranded up s**t creek with yet another dead ‘Enterprise Data Store’ thing.

It’s a catch-22.  So it’s not unreasonable to think that Autodesk needs developers like us to adopt, promote, and sell Topobase to help keep it alive.  I’m interested in hearing your opinions on this subject…

December 3, 2008

SQL Server 2008 Spatial

Filed under: General — Tags: — Darrin Maidlow @ 9:49 am

Well, day 4 in Las Vegas for Autodesk University.  I took my first weekend off in a long, long time =).  Monday was the ADN Developer Day; sadly it’s all NDA goodness and I really don’t wanna get a call from the Autodesk lawyers =).

This morning I attended “SQL Server 2008 with AutoCAD Map 3D and Autodesk MapGuide” (GS100-3), presented by Orest Halustchak (Technical Architect at Autodesk) and Isaac Kunen (Senior Program Manager at Microsoft).  First off, it was really nice to see the brains from Microsoft brought in for a presentation on SQL Server.

This class was especially interesting to me, as I have not yet made time to look at SQL Server 2008 in great depth.  It looks like Microsoft has done a pretty good job with their first spatial offering.  I’m always hesitant to jump in head first to a first release – hey, I waited until MapGuide Enterprise 2009 before I even really considered writing any code for it.  If you are new to spatial databases, or you have a more basic need (no need for more advanced linear referencing system or network tracing functionality), SQL Server might be right for you.  If you already have SQL Server experience, the learning curve going to SQL 2008 spatial is going to be a lot less nasty than starting with Oracle Spatial.

The spatial functionality is available in all editions of SQL Server, except for the Compact edition.  Yes, even the free Express edition contains the spatial engine.  This is good news.  Oftentimes when Oracle comes up (even the Express edition), people get scared – Oracle has a scary, difficult stigma associated with it.  Spatial abilities with SQL Server Express could be very helpful in getting a spatial engine into smaller shops.
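To give a flavour of what the engine offers, here is a minimal T-SQL sketch – the table and geometries are made up purely for illustration:

    -- Hypothetical table with a geometry column (SRID 0 for simplicity).
    CREATE TABLE Parcels (
        Id       INT IDENTITY PRIMARY KEY,
        Boundary GEOMETRY NOT NULL
    );

    INSERT INTO Parcels (Boundary)
    VALUES (geometry::STGeomFromText('POLYGON ((0 0, 10 0, 10 10, 0 10, 0 0))', 0));

    -- Compute areas and find parcels containing a given point.
    SELECT Id, Boundary.STArea() AS Area
    FROM   Parcels
    WHERE  Boundary.STIntersects(geometry::STGeomFromText('POINT (5 5)', 0)) = 1;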

Microsoft is pretty good with the small details.  One of the cool features of SQL 2008 is the ability to view the spatial data right in SQL Management Studio.  Note this doesn’t seem to be available in the Express edition of Management Studio. 

AutoCAD Map and Mapguide seem to work pretty well with SQL 2008.  I plan to load some larger data sets into SQL 2008 in the coming weeks to see how it performs in comparison to Oracle spatial.

Here is a tip for planning for AU.  Consider budgeting to eat outside of the Autodesk provided buffets.  =)

November 29, 2008

Time for Autodesk University 2008

Filed under: General — Tags: — Darrin Maidlow @ 8:45 am

I’m on the plane heading to Las Vegas. The time has come for another Autodesk University. Yay for my annual “vacation”. This trip is going to be packed full of meetings, classes, and World of Warcraft =). To all my devoted readers, I must apologize for the one month delay in posts. I’ll be working on getting everyone up to speed on what has been going on in the past month here. I hope to post once per day on cool and awesome things at AU. Hopefully there is something cool and awesome at least once per day.

I also have to add that bringing small children on airplanes is cruel, the US Airways coffee sucks, but the orange juice is great =)

September 3, 2008

System.Runtime.InteropServices.ComException Loading ASP.NET Web Application in Visual Studio 2008

Filed under: General — Darrin Maidlow @ 11:04 pm

Another day, another fun error message.  Thanks to all the fun I’ve been having with Oracle on Vista x64, I’ve given up and created a new virtual machine with XP Pro to run Visual Studio in.  So I grab the latest revision from source control and load the solution in Visual Studio.  Boom.  System.Runtime.InteropServices.ComException.  That’s it.  No more details.  This is one of those errors where it could be just about anything, and a web search gives way too many results.

So I’m going to add one more search result on this error message.  After much muckery, I’ve resolved my instance of the problem.  The background is simple: I have an ASP.NET web application with a few DLL projects and a deployment project, and as stated, I’ve created a new install of Windows XP.  First, ensure that any required add-ons are installed – a missing add-on can also cause this error (in some cases).

The key thing in this case was the lack of IIS on the machine.  On my primary development machine (the one where Oracle is a massive pain) I do have IIS installed, and the last time I ran this project from that machine – Oracle was working OK with the data access hack.  But apparently something has changed on that box and now I’m getting the evil "Oracle client and networking components were not found." error.  So I gave up.

The first obvious workaround is to install IIS.  But I’m sick of messing around today and just want to work, so the quicker solution is to enable the built-in development web server.  This can be accomplished by right clicking the unloaded project in the Visual Studio Solution Explorer and choosing "Edit <projectname.whateverlanguageyouuseproj>".  This will bring up the XML view of the project.  Find the ProjectExtensions section and change UseIIS to False.  Setting this up could also prevent problems when you have a larger, or more dynamic, team accessing the project.

  <ProjectExtensions>
    <VisualStudio>
      <FlavorProperties GUID="{349c5851-65df-11da-9384-00065b846f21}">
        <WebProjectProperties>
          <UseIIS>False</UseIIS>
          <AutoAssignPort>False</AutoAssignPort>
          <DevelopmentServerPort>4088</DevelopmentServerPort>
          <DevelopmentServerVPath>/webrade</DevelopmentServerVPath>
          <IISUrl>http://localhost/WebRADE32</IISUrl>
          <NTLMAuthentication>False</NTLMAuthentication>
          <UseCustomServer>False</UseCustomServer>
          <CustomServerUrl>
          </CustomServerUrl>
          <SaveServerSettingsInUserFile>False</SaveServerSettingsInUserFile>
        </WebProjectProperties>
      </FlavorProperties>
    </VisualStudio>
  </ProjectExtensions>

 

Reload the project, and it should load now.

P.S. Oracle, please please please release something for Vista x64 and ODAC/ODP.  Even an alpha.  I promise I will test on an x86 machine before I release..


August 3, 2008

@MApp – a Developer’s take on an amazing RF Design and Drafting Application

Filed under: General — Tags: — Darrin Maidlow @ 8:53 pm

My friend and colleague recently wrote an article on @MApp (sometimes known as @MAppDR, ATMapp or ATT_MApp).  A strange twist of events also resulted in me working very briefly with the most "recent" build of the product.  It had been many years since I really saw the application running – so I had a bit of a fresh eye.  This made me a little bit nostalgic, and also quite proud.  We had really made some amazing software when you got right down to it.  Unfortunately, in "internet years" it has sadly become a dinosaur.

Long ago (it seems like a lifetime) I worked for a company called Kanotech.  One of my first real development projects of scale was @MApp, a powerful extension built for AT&T/Comcast cable.  My first task was to head to Seattle and learn how field technicians would walk the existing RF plant (fancy word for cabling and RF equipment) and map the existing assets, on paper.  From there, the task was to devise a way to run @MApp on one of the early rugged pen computers.  I walked the streets of the greater Seattle area with some of the field technicians and got my first introduction to the world of cable TV networks.  At the end of this trip I had a collection of notes and enough information to design and implement the pen-based version of @MApp.  This was the beginning of my life with @MApp.

With the bigger picture in mind, there were many underlying goals for @MAppDR.  One of the key needs was standards enforcement (and creation – Evan created the first solid standard which, though very much evolved, is still in use today).  Like many organizations back in the day (and likely even today), the mapping data being generated was garbage.  Each division, each consultant, sometimes each drafter had their own set of "Standards".  As Evan put it, the mapping data was "only good for printing" – and even that was a stretch in some of the sample data we saw.  Our top priority was to create a set of tools which facilitated drafting standards-compliant drawings – as well as a method to test each drawing to ensure it was truly compliant.

Under the hood, @MAppDR was all database driven.  Layers, cable types and properties, even symbols and block attributes could be configured in the database.  Generic Lisp calls were defined to allow simple wrapper functions to be created to add new entities to the application.  The application had been created using an extensive amount of AutoLISP, VB6, Python, and SQL Server/Access databases.  Let me tell you, I still have nightmares about parentheses =).

@MAppDR drafting tools

The drafting tools, in retrospect, were good.  Damn good.  The obvious elements were there – select a cable or equipment type using standard AutoCAD toolbar buttons – but @MAppDR went a step further.  It would create/set the layer, set the desired object snaps, and draft the cable.  Network-connected equipment would snap to the cable, rotate, trim and physically connect itself to the model.  In some circumstances, inserting a block/cable of a certain type also required an accompanying block.  If required, the user would be prompted to insert these as well; in some cases these blocks were placed automatically.  Using @MAppDR, a user could draft very clean RF drawings without having to really know AutoCAD.  Every piece of possible equipment was available from a comprehensive set of toolbars – and only one click away.

On insertion of blocks, the user was presented with a form powered by the attribute information stored in the databases.  Some attributes were required, some were populated programmatically based on nearby objects or other conditions, and some were selected from lookups or manually entered by the user. 

@MAppDR had a really cool connectivity process.  This process would perform physical connectivity on the network, following rules defined for equipment inputs and outputs.  In addition, rules were defined to dictate what equipment could connect to other equipment.  This connectivity, in my opinion, was one of the most powerful components of @MAppDR.  Connectivity information was stored on the entities, so when a trace was done we could follow the network and do all kinds of great reporting.  I recall one situation where we knew that a specific combination of equipment, when set up together, would cause service problems.  Within a few hours, I had a batch process created that would process hundreds of drawings and spit out a report of nodes and locations where this combination existed.  The problems in the field were fixed even before customer complaints came in.

One of the key requirements for @MAppDR was a quality assurance process.  We defined several "levels" of QA, and each level required a password to execute.  The first two levels of QA were for contractors doing mapping redraft, or drafting.  These levels would let the consultants know that the required data had been entered and would "stamp" the drawing.  The drawings were then submitted, and Comcast staff would run their password-protected version of the QA routines – which would verify that the contractor had indeed done the required work.  If not, the drawing was rejected and the contractor had to fix it.  I can say with a great deal of certainty that Comcast, using @MAppDR, is one of the few organizations with near-perfect data.  And the things they do with that data are incredible.

@MAppRF Drafting Enhancements

The final version of @MApp, @MAppRF, could have dominated the cable industry.  Sadly, it was never completed.  This release added RF design to the drafting.  A library of equipment was created, as well as design parameters – a sub-selection of the equipment that let the RF designer know what equipment was available to be used.  All of the equipment’s operating parameters were captured.  As the user drafted equipment, the actual equipment model would be selected in the attributes.  This allowed us to increase the efficiency of populating attributes during the drafting phase, all the while making the stored data even more accurate and complete.

One of the coolest bits of @MAppRF was what came to be known as the TRID.  The TRID was a multi-threaded C++ control that contained a tree-grid hybrid.  This form could be docked within AutoCAD or float on a separate monitor.  As the user drafted, the RF signal and power calculations were performed in real time, without interfering with the drafting process.  I recall being told this project would be simple – RF design is just table math.  And it is.  But mix in a multi-threaded C++ form running per-entity calculations on a potentially infinite number of frequencies, both forward and reverse – oh, and sprinkle in a little bit of power draw calculations – then tell me it’s simple.  =)

If I could do it over again – wow, what a difference the current technology would make.  By far the single largest flaw @MApp has is its dependence on drawing-based storage.  Given the timeline, we didn’t have a choice.  Oracle Spatial was available, but immature, and going with Oracle at that time would have likely been more grief than it was worth.  Seamless map access would be a must.  Given the times, we used a special piece of ObjectARX code called the SPE.  This gave us spatial analysis abilities that rivaled even Oracle Spatial.  One of the major downfalls here is that it required AutoCAD to run.  Therefore, most of the calculations had to occur within AutoCAD itself.  This became a problem as other applications could benefit from these operations – but couldn’t take advantage of them without also being AutoCAD based.

Leveraging a multi-tiered architecture, a lot of the common functions could be moved out of AutoCAD.  The TRID could still be a multi-threaded form in AutoCAD, but the calculations could be done on the server side – thereby making the AutoCAD portion a presentation layer instead of a business logic layer.  The RF design calculations could then be done from any user interface, instead of having to open a drawing in AutoCAD.

And that, my dear reader, concludes my epic tale of @MApp.  Even this large, novel-like piece only scratched the surface of the project.  It was my life for many years – I slept under my desk a number of times – but it was a great project.  It was fun to think about it again and put together something to share with you. =)


June 13, 2008

Upgrading SourceGear Vault Server to new x64 release on Windows 2003 x64/IIS6

Filed under: General — Tags: — Darrin Maidlow @ 11:11 pm

SourceGear recently released a 64-bit native build of the Vault server.  I’m on a real x64 kick lately and really enjoy not seeing *32 beside my processes in Task Manager.  Vault is one of the best source control providers out there, and you cannot beat the price either.  It is core to my professional life – next to Visual Studio, it is one of the most important pieces of software I use.  So I did a quick backup of my databases, un-installed the old server and installed the shiny new x64 code.  Problems!

First off, my server was running IIS in 32-bit mode, which was required to run the previous releases of Vault.  Once the install was complete, I opened a command prompt and set IIS back to 64-bit:

cscript.exe C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/AppPools/Enable32BitAppOnWin64 0

Then I ran an iisreset.

When I fired up my browser to check the Vault service, there was a problem.  All it would display was "Service Unavailable".  At this point, even HTML files were not being served out.  I did a search on the SourceGear forums and couldn’t find anything.  It’s been a long week and I was not firing on all cylinders – so I called up the support team.  (Good thing I renewed my maintenance, oh, this morning =D).  At this point Beth from SourceGear and I brainstormed through the situation and came to the following conclusions:

I was the first customer to call with x64 problems.  Yay for being first!

In IIS Manager, the application pool was disabled.  A check of the event logs showed the following information:

Source: W3SVC-WP

Event ID: 2268

Could not load all ISAPI filters for site/service.  Therefore startup aborted.

This prompted a check of the web service extensions.  Sure enough, there was a web service extension configured for ASP.NET pointing to the 32-bit assemblies.  I prohibited this extension, and added a new one pointing to the 64-bit aspnet_isapi.dll.  I re-enabled the application pool and loaded a page in the browser – still nothing.

Finally – the last step needed to get everything serving properly was to run the following from the x64 framework folder:

c:\windows\microsoft.net\framework64\v2.0.50727\aspnet_regiis -I -enable

So in summary the following steps should get your Vault server upgraded and running in native x64.  Bear in mind, my server is ONLY running Vault and these steps will break ASP.NET 1.1 applications (and lower) and possibly other code you might have running on the server.

  1. Backup SGVault and SGMaster databases (did I even need to include this?)
  2. Un-install the 32 bit Vault server
  3. Install the 64 bit Vault server
  4. Run cscript.exe C:\Inetpub\AdminScripts\adsutil.vbs set W3SVC/AppPools/Enable32BitAppOnWin64 0
  5. Run c:\windows\microsoft.net\framework64\v2.0.50727\aspnet_regiis -I -enable
  6. Start IIS Manager.
  7. Double check the Vault virtual directories to ensure the ASP.NET version is set to v2.0
  8. Prohibit the ASP.NET 32 bit ISAPI web service extension
  9. Add the ASP.NET 64 bit ISAPI web service extension
  10. Run IISReset.exe

 

mmmm x64 goodness.  Thanks again to Beth for helping me work through this =)

May 30, 2008

Implementing Remotesoft .NET Protector using MSBuild

Filed under: General — Tags: — Darrin Maidlow @ 4:02 pm

RADE has grown significantly in size and complexity over the past four years.  What started off as a relatively simple classic ASP application has grown to 8+ .NET assemblies, with numerous 3rd party DLL references.  Current R&D is going to further increase the size of the build. In addition to that, we’ve developed several vertical products on top of RADE which need to be updated as new revisions of the base framework are completed.

It’s come to the point where I need to get to a one-step build.  The first move here was to implement Remotesoft .NET Protector in my build process, as protecting the assemblies ended up being one of the bigger pains in the butt when building.

So to kick things off, we need to work with MSBuild a little bit.  Originally, I tried using the <exec> call from MSBuild, but this didn’t give me the flexibility to loop through the files being generated.  So I started writing a custom build task.  If you are new to this, please read up on building custom MSBuild tasks first.

In summary, I defined a number of get/set properties for the values I wanted the build engine to set, and in the Execute function I set it up to loop through the passed .DLL files and execute Protector on each.  After the DLLs were processed, they were moved out of the protected folder and the protected folder was removed.  If you are having problems getting your task running, there are good articles out there on debugging custom MSBuild tasks.
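For anyone who hasn’t built one of these before, a custom task along those lines might look roughly like the sketch below.  This is a hedged illustration rather than the actual RADE build task: the property names mirror the ones used in the project file further down, the exclusion handling is simplified, and the move-out-of-the-protected-folder step is only hinted at in a comment.

    // Sketch of a custom MSBuild task that runs a protector tool on each DLL.
    // Property names match the <RemoteSoftProtector> element shown below; the
    // body is illustrative only.
    using System.Diagnostics;
    using System.IO;
    using Microsoft.Build.Framework;
    using Microsoft.Build.Utilities;

    namespace Landor.Deploy.BuildTasks
    {
        public class RemoteSoftProtector : Task
        {
            [Required]
            public ITaskItem[] Files { get; set; }        // DLLs to protect
            [Required]
            public string ProtectorEXEPath { get; set; }  // full path to protector.exe
            public string ProtectorParams { get; set; }   // extra command line switches
            public string BinFolder { get; set; }         // build output folder
            public string Exclusions { get; set; }        // semi-colon delimited file names to skip

            public override bool Execute()
            {
                foreach (ITaskItem item in Files)
                {
                    string file = item.ItemSpec;
                    if (!string.IsNullOrEmpty(Exclusions) &&
                        Exclusions.Contains(Path.GetFileName(file)))
                        continue;   // third party assemblies we are not allowed to touch

                    Log.LogMessage("Protecting " + file);
                    ProcessStartInfo psi = new ProcessStartInfo(ProtectorEXEPath,
                        ProtectorParams + " \"" + file + "\"");
                    psi.UseShellExecute = false;
                    using (Process p = Process.Start(psi))
                    {
                        p.WaitForExit();
                        if (p.ExitCode != 0)
                        {
                            Log.LogError("Protector failed on " + file);
                            return false;
                        }
                    }
                }
                // The real task would now move the protected output back into BinFolder
                // and remove the tool's output folder.
                return true;
            }
        }
    }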

RADE uses a Visual Studio 2008 web deployment project to deploy all of the files on build, so to implement the new task we need to do some editing in that project.  Open the project either with a text editor, or in Visual Studio by right clicking the project and choosing "Open Project File".  This part is quite simple.  First, ensure that the assembly generated by building your task is in the same folder as the web deployment .wdproj file.  Next we need to add a line near the top of the web deployment project:

  <Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="3.5">
    <UsingTask TaskName="Landor.Deploy.BuildTasks.RemoteSoftProtector" AssemblyFile="Landor.Deploy.Buildtasks.dll"/>
    <PropertyGroup>...

Here we point the UsingTask element to the namespace and class of our protector code, as well as the assembly.  One more change to make: scroll to the end of the project and you should see a number of empty <Target> tags.  We need to add some code to the Name="AfterBuild" target.

  <Target Name="AfterBuild">
    <ItemGroup>
      <DLLFiles Include="$(MSBuildProjectDirectory)\Release\Bin\*.dll"/>
    </ItemGroup>
    <RemoteSoftProtector
      Files="@(DLLFiles)"
      ProtectorEXEPath="C:\Program Files (x86)\Remotesoft\Protector\bin\protector.exe"
      ProtectorParams="-neutral -string -cctor -clrversion v2.0.50727"
      BinFolder="$(MSBuildProjectDirectory)\Release\Bin\"
      Exclusions="AjaxControlToolkit.dll;Microsoft.Xml.Schema.Linq.dll;ZedGraph.Web.dll"
    />
  </Target>

 

Two things occur here.  First, in the <ItemGroup> tag we are initializing an item list called DLLFiles, which picks up all the .DLL files in the project’s Release\Bin build folder.  Note that this creates a semi-colon delimited list of full paths and file names.

The next thing that occurs is actually calling the build task, using the <RemoteSoftProtector> tag.  The tag name must match the name of the build task’s class.  Within this tag, we set all of the defined public properties, using the same names as those defined in our build task class.
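As a quick usage note, once the task is wired into the .wdproj, the whole build-and-protect step can also be kicked off from a command prompt – a sketch, with a made-up project file name:

    "%WINDIR%\Microsoft.NET\Framework\v3.5\MSBuild.exe" MyProduct.Deploy.wdproj /t:Build /p:Configuration=Release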

This concludes a day of fun learning how MSBuild and custom tasks work.  Hopefully it helps you out a bit.  Bugs or comments, let me know.

