Monday, July 31, 2006

Postal address required post XP SP2?

One wonders at Microsoft's ability to last long term. Their business model is fundamentally challenged as it is. In this day and age, a licensed, paying software customer should not have to rely on the US Postal Service to get software components they own a license for.

Case in point: I have a Windows workstation (who doesn't?), which runs a full-cost client license for XP, one of the $300+ babies, not an OEM license. It runs fine, is serviceable, rarely crashes, etc., all the things you expect from a modern OS. Recently I needed IIS on it, as my old IIS dev server finally bit the big one.

No problem, right? Just install IIS from the XP CD. Well, here is the interesting problem, which apparently requires the US mail to address.

When I updated to SP2 via the Windows Update function, I made my original CD essentially useless. I am unable to install IIS from the XP SP1 disc, as the installer is looking for newer SP2 files. I did some searching on Google and found I am not the first to encounter this. The SP2 update itself is available via the MS web site.

I also found a fix for this issue (from Microsoft), IF you have both the SP1 CD AND the SP2 CD. (Just how many people in the world is that right now? I may be going out on a limb, but the number is a very, very small percentage of SP2 users.) Here is where Microsoft's model is breaking: to get a copy of the SP2 update CD, I have to order one, pay for it to be delivered, and wait for the US post.

Since the client I am doing this for would laugh at the suggestion that we put the project on hold for a week while I wait for software to be delivered via the mail, I kept searching.

I then found some instructions on how to slipstream the downloaded copy of the SP2 patch into the SP1 files and burn your own CD. This seemed far more reasonable, although you'll notice these are not Microsoft KB articles; so while this is possible, it is not something Microsoft wants you to know about.
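
For the record, the slipstream itself boils down to two commands. Here is a rough sketch from memory; the drive letter and the C:\xpcd staging folder are my own choices, and WindowsXP-KB835935-SP2-ENU.exe is the full SP2 network-install download:

xcopy /e /i D:\ C:\xpcd
WindowsXP-KB835935-SP2-ENU.exe /integrate:C:\xpcd

The /integrate switch merges the SP2 files over the SP1 tree, and the resulting directory is what you burn.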

After a long time, I ended up with a directory that for all intents and purposes was an XP SP2 CD. Good news, right? Wrong. The IIS installation routine would not accept the resulting directory of XP with SP2 merged in via the update.

Ugghhh. Now remember, this is to install a software component that is part of the operating system I paid for, and paid good money for. Had this been open source software, I could have downloaded the latest CD free of charge via the web, and would probably have found a trove of documentation outlining the process I needed to follow.

Microsoft does not make this type of download service available to paying customers because their software might be stolen; in other words, they do not trust their customers. Well, who could blame them? But this is the heart of the problem they face over their next 5 years: how can you bill yourself as a business partner to someone you clearly don't trust in your most basic interactions?

Ultimately it turned out that there is a repair you can run against the security database to fix this issue. I could find no KB article from Microsoft on this method. They do mention the tool in their troubleshooting documentation, but not with the /p switch, which is what seemed to fix the issue.
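
As best I can tell, the tool in question is esentutl, which can repair that database; assuming I have that right, the fix is a one-liner at a command prompt (back up the file before repairing it):

esentutl /p %windir%\security\database\secedit.sdb

The /p switch is the repair mode, which is presumably why it doesn't appear in the friendlier troubleshooting docs.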

This brings me to another challenge they face. Because the source is closed, there are a finite number of knowledgeable people to investigate and work on issues. This is one of the reasons that even after you pay $300+ for the program that makes your PC work (the OS), you have to pay another $250 to ask a question about why something doesn't work. That is money you pay in addition to your time waiting on hold in a support queue and explaining your issue to multiple helpdesk personnel, many of whom have solid 'English as a second language' skills and very few technical skills beyond searching the same knowledgebase you have access to.

Add to this that they maintain a monopoly on technical information about their closed product via their web site, portions of which don't work well in browsers from other companies. So if you use a Mac or Linux machine to search for answers, you may find the articles that might contain those answers will not render in your browser.

All of this points to inefficiencies in their overall processes. Inefficiency raises costs, and ultimately these will be the costs that sink the ship as it is built today.

I wish Mr. Ozzie good luck with these challenges.


Thursday, February 09, 2006

Hair Loss and Extensions (pt 2)

(or how I learned to plug FrontPage Extensions into a Plesk subdomain)

- Part 2 (Read Part 1... The Madness)

How to add Frontpage extensions to a Plesk subdomain. This assumes you have the following created in Plesk:

1) Plesk domain, with Frontpage extensions enabled
2) One or more subdomains created in Plesk for the primary domain

Technical Details
Frontpage extensions are used by Microsoft applications to communicate with, read, and publish data to web sites. Details about them can be found here. The basic Frontpage environment consists of:

1) Frontpage executables and configuration files, in /usr/local/frontpage/
2) Web site directories, files, and .htaccess controls

If Frontpage is installed in the root directory then it is already configured for the server; you just need to enable it for the individual domains. Plesk does this with its websrvmgr config utility for the main domains, but its syntax does not seem to account for subdomains.

Here is the recipe:

1) Create a dummy domain (dd) in Plesk and turn on Frontpage extensions, using the user name and password you want for your subdomain

2) Change to the subdomain httpdocs directory:
  • cd /home/httpd/vhosts/domain/subdomains/subdomain/httpdocs/
3) Copy the following from the dummy domain:
  • cp -R /home/httpd/vhosts/(dd)/httpdocs/_vti* .
  • cp /home/httpd/vhosts/(dd)/httpdocs/_private/.htaccess _private/.htaccess
  • cp /home/httpd/vhosts/(dd)/httpdocs/.htaccess .
4) Open the .htaccess file and edit the three auth lines for your environment:
  • AuthName (subdomain.domain)
  • AuthUserFile (path to the user password file): /home/httpd/vhosts/domain/subdomains/subdomain/httpdocs/_vti_pvt/service.pwd
  • AuthGroupFile (path to the group membership file): /home/httpd/vhosts/domain/subdomains/subdomain/httpdocs/_vti_pvt/service.grp
5) Save this file and copy these three lines; you will need to replace them in a number of other .htaccess files.

6) Edit each of the .htaccess files. You can find them all, so you don't miss any, with:
  • find . -name '.htaccess'
7) In each of these, replace the AuthName, AuthUserFile, and AuthGroupFile entries with the lines you copied when you edited the main .htaccess file.

Don't change anything else in these files.


8) Edit '_vti_pvt/access.cnf' and set the Realm and PasswordFile entries appropriately:
  • Realm is subdomain.domain.com
  • PasswordFile is /home/httpd/vhosts/domain/subdomains/subdomain/httpdocs/_vti_pvt/service.pwd
9) Change to the '_vti_pvt/' directory.

10) chmod go+rw frontpg.lck

11) chown user:root service.cnf
(user is the Plesk ftp username)

12) Check ownership and permissions on all the files mentioned; they should be owned by user:psaserv. A quick spot-check is sketched below.
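
A quick way to spot anything with the wrong owner or group is a find over the subdomain docroot (user is again the Plesk ftp username; adjust the path for your domain):

find /home/httpd/vhosts/domain/subdomains/subdomain/httpdocs \( ! -user user -o ! -group psaserv \) -ls

Note that service.cnf, which step 11 deliberately handed to user:root, will show up in this listing; that one is correct as-is.
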
Once all these changes have been made, the subdomain site has the files it needs; we just need to let apache and Frontpage know what's up.

In /usr/local/frontpage there are files for the parent domain, one in the conf directory and one in the root; we will need a pair of these for each subdomain we created. Copy the files and then we can edit them:

cp /usr/local/frontpage/domain:80.cnf /usr/local/frontpage/subdomain:80.cnf

cp /usr/local/frontpage/conf/domain.fp.80.cnf /usr/local/frontpage/conf/subdomain.fp.80.cnf

The conf file is an httpd conf excerpt. Edit it to make sense for the subdomain: ServerName, User, DocumentRoot, and Directory should all be altered appropriately.

The :80.cnf file seems to be a pointer to the conf file. You may notice that primary domains (which also support the www alias) have linked files for the www. name; for subdomains you don't need this.

The last change is to the httpd.include file. In
/home/httpd/vhosts/domain/conf/httpd.include
you need to find the Directory section and duplicate it for the subdomain entry.

You need to make a backup copy of this httpd.include file, because it will be overwritten by Plesk. The other changes should survive, though.

So if you build this monster, you need to automate the replacement of that file and a restart of the httpd server; a sketch follows.
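
Something along these lines, run from cron or after any Plesk action, would do it; the paths and the .fp-backup suffix are my own conventions, not Plesk's:

#!/bin/sh
# Restore our Frontpage-aware httpd.include if Plesk has rewritten it.
CONF=/home/httpd/vhosts/domain/conf/httpd.include
BACKUP=/home/httpd/vhosts/domain/conf/httpd.include.fp-backup

if ! cmp -s "$CONF" "$BACKUP"; then
    cp -p "$BACKUP" "$CONF"
    /etc/init.d/httpd restart
fi

cmp -s exits non-zero when the files differ, so the copy and restart only fire after Plesk has clobbered the file.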

Thursday, February 02, 2006

Hair Loss and Extensions

(or how I learned to plug Frontpage Extensions into a Plesk subdomain)

- Part 1



We use Plesk 7 to manage our hosting servers. I like its interface: it is easy to use, and I find my clients have few problems navigating it.



I recently had a new client with a primary domain and a few subdomains sign up for our hosting services. It was our first hosting contract involving subdomains and a large population of publishers using Microsoft products, including Frontpage, Word, and Excel, to keep portions of their site updated.



During deployment of the site I realized that Plesk 7 doesn't support Frontpage extensions for subdomains. This was a disappointment, and we worked to try to enable the Microsoft products using FTP as the file transfer method. As usual we found that Microsoft supports its own methods better than standards, and it became apparent that implementing Frontpage extensions on the subdomains would make editing, updating, and managing the site easier.


I spent a bunch of time reading up on Frontpage extensions, especially as they relate to Plesk, and specifically Plesk 7. There is not a lot of detail out there. I found a great Frontpage support resource, with details on installing from scratch on a Linux/Apache server. Unfortunately that is not exactly what I needed: Plesk does some strange things, like having its own suexec, so we needed to fit into that system.


Further research yielded a way to manually clean up an installation that is not behaving, and this led me to realize that the extensions are not all that foreign; they are just a bunch of directories, .htaccess files, executables, and configuration files. I created a blank test domain, traced the files and directories created, and came up with a recipe that worked for me.


Caveats/Warnings:

  1. If you edit the web site settings in Plesk after you make these changes, Plesk will overwrite your changes to the domain httpd.include file

  2. If you reboot the server Plesk will overwrite the changes to the domain httpd.include file

  3. File ownership and permissions are critical; pay careful attention

  4. I did not spend the time to script recreation of the environment in the event of corruption

  5. If you follow these instructions in a production environment without first testing them you deserve the lynching you get, and don't blame me.

Considerations:

  1. You may want to remove FTP access afterwards, because all management can be accomplished through Frontpage and ftp users can corrupt files. To do this, clear the FTP password in Plesk

  2. Be sure to test your changes and familiarize yourself with the files and structures before going to a live environment

Ok, enough gasbagging... Part 2 is coming up next...

Saturday, January 28, 2006

Dark Fiber, Sonicwall, PPPoE and Macs

Recently a client of mine had the opportunity to get a fiber optic connection to his office. Verizon is making this service, called FiOS, available in many areas. They can certainly provide a lot of services with this type of bandwidth; phone, internet, and video seem to be their current focus.

Anyway, we cut over the internet connection, which in this case is handled by a Sonicwall TZ170. It had been connected to a cable modem using DHCP and running NAT, and we switched to a PPPoE connection with credentials, again NAT'ing. We set the MTU to 1490 and told the Sonicwall to fragment outgoing packets as necessary. Checked the connection from the Windows server: worked great. Desktops: great. And I thought we were done, another routine day.

Well, you know how that goes: anytime it seems too easy, there is always a tiger lurking.

This time it was the Macs. Yep, the Tiger OS was waiting to pounce. It seems the Macs connected and could ping hosts, but web sites and mail were not working. Actually, web sites with little content, like say Google, worked fine, but cnn.com and apple.com did not. The bad sites would start to load but never finish, usually hanging on large graphics.

Turns out the Macintosh machines' network interfaces (both AirPort and Ethernet) are set to an MTU of 1500, and for some reason the Sonicwall was not able to fragment those packets on the way out. Changing the MTU to 1490 with ifconfig fixed it:
ifconfig en1 mtu 1490
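
To confirm where the ceiling sits, you can probe with ping and the don't-fragment bit set (OS X syntax; 1462 bytes of payload plus 28 bytes of IP and ICMP headers fills a 1490-byte packet, and 1472 fills 1500):

ping -D -s 1462 www.apple.com
ping -D -s 1472 www.apple.com

The first should squeeze through the 1490 path; the second, a full 1500-byte frame, should stall the same way the big web pages did.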


This seemed strange and inconvenient. And while I have a familiarity with Macintosh machines of all types, I have not spent much time with OS X; I survive the experience because of all the years of linux/unix. I tried to get the machines to set their interfaces via an rc.local type of approach, but it wasn't working like the BSD I remember. So I visited Apple (on one of the Windows machines) and found a super helpful doc that outlines how to set the MTU via a startup item. In my opinion they have the startup methodology worked out well; it's very organized and seems very reliable.
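
For the curious, the shape of a Tiger-era startup item is roughly as follows; the SetMTU name and the en1 interface are my own choices, and the Apple doc covers the real details. Create an executable script at /Library/StartupItems/SetMTU/SetMTU:

#!/bin/sh
# Re-apply the reduced MTU for the PPPoE path at every boot.
. /etc/rc.common

StartService()
{
    ConsoleMessage "Setting en1 MTU to 1490"
    ifconfig en1 mtu 1490
}

StopService() { return 0; }
RestartService() { StartService; }

RunService "$1"

Then drop a StartupParameters.plist next to it declaring Provides = ("SetMTU") and Requires = ("Network") so it runs after the network comes up.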

As for the Sonicwall side of this, I've notified Sonicwall of the issue and they are looking into it.

I have a sneaking suspicion that the Windows machines were not using an MTU of 1500; I swear I remember reading an article about Windows MTU somewhere that mentioned Windows has a low default frame size. I was unable to find this document, though. If you know of it, I'd love to hear from you.

Anyway, it won't be too long before my next post.

Till Next Time,

Sean Riley
President, DogRiley

Thursday, October 20, 2005

FileZilla and Windows FTP

Well, in the Windows world, graphical FTP clients have often been a sore spot. Good ones cost money (a la WS_FTP, etc.), and free ones that were actually useful were elusive.

Look no further: FileZilla is here. It's a SourceForge project, and I have been using it for about the last month. It is awesome. It's easy to set up and use, and it has all the features you need in an FTP client.

If you are looking for FTP on Windows, check it out.

Now, Microsoft Windows has 'Network Places', which can be used for some FTP duties, but I have found it problematic, especially with large transfers, and it lacks the ability to resume transfers. To be fair to Windows, the Mac still lacks solid FTP support too. For our Mac friends, though, Cyberduck has proved to be a solid client, and they may want to check it out.

Till Next Time,

Sean Riley
President, DogRiley


Thursday, September 29, 2005

Remote Web Workplace error

Boy, it has been a while since my last update. Vacations in August and a busy beginning to September can do that.

I ran into an issue with a client connecting remotely with Remote Web Workplace and getting a strange error. When connecting to the server all would go fine until he chose a machine to connect to and clicked Connect. At that point a strange error would pop up, preventing him from connecting:

Line: 272
Char: 4
Error: Invalid procedure call or argument
Code: 0
URL: https://fqdn.com/Remote/tsweb.aspx?
After some looking I found that the issue is related to the screen display size. If you go into the Options drop-down and choose 1024x768 as the screen resolution, the problem resolves itself. I will have to look into why this issue affected the user in this particular situation and not in other locations.

SmallBizServer.Net gets the shout for the assist in resolving this problem.

Till Next Time,

Sean Riley
President, DogRiley




Tuesday, August 23, 2005

sqlservr.exe high memory usage for SBSMonitoring process

This seems to be a common issue with my SBS2003 installs. The administrative user will get an e-mail entitled 'Allocated Memory Alert on ServerName' which directs you to the process table to find the culprit. In most of my cases I see two hogs: store.exe (Exchange) and sqlservr.exe. We will cover the issues surrounding store.exe in a later post, but for now here are my findings for the sqlservr.exe process. I found the fix in this thread on winserverhelp.com.

First, verify that the sqlservr process at fault is in fact the SBSMonitoring one. Open Windows Task Manager, choose the Processes tab, click the View menu, choose Select Columns, and check the PID column. This will show you the process ID of each entry in the process table. Sort the list by the Mem Usage column and get the PID of the offending sqlservr.exe process. Then, to confirm that PID is SBSMonitoring, at a command prompt type:
tasklist /svc
Look for the PID that matches your memory hog. It will look something like this (my PID was 376):

sqlservr.exe 376 MSSQL$SBSMONITORING

If it does, you can alter the memory settings for the SBSMonitoring instance with the following commands from the command line (thanks to David Copland):

osql -E -S servername\sbsmonitoring
sp_configure 'show advanced options',1
reconfigure with override
go
sp_configure 'max server memory', nnnn
reconfigure with override
go

My session looked like this:

C:\Documents and Settings\Administrator>osql -E -S file-server\sbsmonitoring
1> sp_configure 'show advanced options',1
2> reconfigure with override
3> go
DBCC execution completed. If DBCC printed error messages, contact your system
administrator.
Configuration option 'show advanced options' changed from 0 to 1. Run the
RECONFIGURE statement to install.
1> sp_configure 'max server memory',100
2> reconfigure with override
3> go
DBCC execution completed. If DBCC printed error messages, contact your system
administrator.
Configuration option 'max server memory (MB)' changed from 2147483647 to 100.
Run the RECONFIGURE statement to install.
1> quit



I set my server to use a maximum of 100MB of memory, and so far all seems well with it. I am planning on adding this to the normal maintenance cycle for all of our supported SBS2003 servers.

Till Next Time,

Sean Riley
President, DogRiley