Showing posts with label server. Show all posts

Apr 5, 2010

How to add a new domain to all email addresses in Active Directory

So recently one of our System Administrators was tasked with moving all of our users over to a new domain that we bought. He thought, “How do I add a new domain to all email addresses in Active Directory using PowerShell?” That's when he thought, who better to help with some scripting than [cue heroic music] ‘FreedomChicken’! The ever-friendly developer who, with his powers of Debugging and Google Searching, can single-handedly defeat evil Bugs and Hang-ups in applications and scripts. With FreedomChicken on the job, the two were able to defeat this evil foe in a quick and timely manner. The spoils of that victory are as follows.

PowerShell Script:

Get-Mailbox %username% | ForEach-Object {

    # Grab the local part (everything before the @) of the primary SMTP address
    $user = $_.PrimarySmtpAddress.ToString().Split("@")[0]

    # Tack a secondary smtp address at the new domain onto the mailbox
    $_.EmailAddresses += "smtp:$user@NEWDOMAIN.com"

    $_

} | Set-Mailbox
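The address rewrite in that loop boils down to “take everything before the @ and bolt the new domain on the end.” Here is a quick way to sanity-check that logic outside of Exchange (the address below is made up):

```shell
# Hypothetical address, just to verify the rewrite logic
addr="jdoe@olddomain.com"
user="${addr%%@*}"                 # strip everything from the first @ onward
echo "smtp:${user}@NEWDOMAIN.com"  # prints: smtp:jdoe@NEWDOMAIN.com
```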


Apr 2, 2010

Setting up Pivotal 6 on SharePoint 2007 Standard

According to Pivotal, setting up Pivotal 6 on SharePoint 2007 Standard should work out of the box. Well, I guess if by “out of the box” you mean having to go into your SharePoint server, run a script, and edit some XML files, then yes, out of the box. We are currently working on setting up our new Pivotal 6 environment so we can FINALLY get rid of IE6!!! It will be one of the best days of my life when we can finally get rid of that virus magnet they call a browser. But that’s beside the point. This post is a quick how-to for fixing an issue you may hit when attempting to create a new Pivotal site in SharePoint 2007.

If you are running into this error when attempting to create a new Pivotal Site collection on SharePoint:

Feature '2510d73f-7109-4ccc-8a1c-314894deeb3a' is not installed in this farm, and can not be added to this scope.

The fix is as follows:

1. Register the ReportListTemplate feature on the SharePoint server. Run the following in a command prompt:

pushd %programfiles%\common files\microsoft shared\web server extensions\12\bin

stsadm -o installfeature -filename ReportListTemplate\feature.xml -force
stsadm -o activatefeature -filename ReportListTemplate\feature.xml -url TheURLofYourAdminSite

pause


2. Comment out the following lines in the onet.xml files:

<!--<Feature ID="D250636F-0A26-4019-8425-A5232D592C09" />
<Feature ID="00BFEA71-DBD7-4F72-B8CB-DA7AC0440130" />
<Feature ID="065C78BE-5231-477e-A972-14177CC5B3C7" />-->

The onet.xml files can be found in the following two locations:
“C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\SiteTemplates\PivotalSmartClientPortalBlankSite2\xml\”

and

“C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\TEMPLATE\SiteTemplates\PivotalSmartClientPortalTeamSite2\xml”


Now the Pivotal sites will no longer try to load features that are not installed in the Standard edition.

By FreedomChicken

Sep 10, 2009

Clean Up Stale Records in DNS

One of the many, many tasks I was asked to lift off the shoulders of my company’s other Sr. Systems Engineer was cleaning up DNS in the many domains we administer. He apparently took over an aging network that needs some serious TLC.

Since he is overloaded with many other projects, I gladly took that one off his plate. If you ever find yourself having to do a similar project, it is actually pretty easy. You see, Microsoft has built a feature into its DNS server that automatically cleans up stale records in DNS; the problem is they don't turn it on by default. A lot of Systems Administrators overlook this feature when initially setting up DNS on their networks, and over time that can create problems and a messy DNS structure.

The feature is called Aging and Scavenging. If this feature is never enabled you may encounter the following problems:

  • If a large number of stale resource records remain in server zones, they can eventually take up server disk space and cause unnecessarily long zone transfers.
  • DNS servers loading zones with stale RRs might use outdated information to answer client queries, potentially causing the clients to experience name resolution problems on the network.
  • The accumulation of stale RRs at the DNS server can degrade its performance and responsiveness.
  • In some cases, the presence of a stale RR in a zone could prevent a DNS domain name from being used by another computer or host device.

To enable Aging and Scavenging on all DNS zones, do the following:

  1. Open the DNS snap-in.

  2. In the console tree, right-click the applicable Domain Name System (DNS) server, and then click Set Aging/Scavenging for All Zones.

  3. Select the Scavenge stale resource records check box.

  4. Modify other aging and scavenging properties as needed.


[Via Technet]

You can also simply make the change on individual zones as well, to do that:

  1. Open the DNS snap-in.

  2. In the console tree, right-click the applicable zone, and then click Properties.

  3. On the General tab, click Aging.

  4. Select the Scavenge stale resource records check box.

  5. Modify other aging and scavenging properties as needed.

[Via Technet]
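If you would rather script this than click through the snap-in, the dnscmd utility exposes the same settings. This is a dry-run sketch — the server and zone names are placeholders I made up, and you should check the interval values against your environment; remove the echo prefixes to actually run the commands:

```shell
# Dry run: print the dnscmd equivalents of the GUI steps above.
# dns01 and corp.example.com are placeholder names.
server="dns01"
zone="corp.example.com"
echo dnscmd "$server" /Config /ScavengingInterval 168   # server-wide scavenging period, in hours (168 = 7 days)
echo dnscmd "$server" /Config "$zone" /Aging 1          # turn on aging for a single zone
echo dnscmd "$server" /StartScavenging                  # trigger an immediate scavenge
```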

So why isn't this enabled by default? According to Technet, there are some risks here:

By default, the aging and scavenging mechanism for the DNS Server service is disabled. It should only be enabled when all parameters are fully understood. Otherwise, the server could be accidentally configured to delete records that should not be deleted. If a record is accidentally deleted, not only will users fail to resolve queries for that record, but any user can create the record and take ownership of it, even on zones configured for secure dynamic update.

Still, I think the good outweighs the bad with this feature. What do you think, though? Have you ever had problems with this feature? Do you know of some other good tools to use for DNS/AD cleanup? Let me know in the comments!

May 19, 2009

Migrate Printers From Windows 2003 to Windows 2008 Server

At my day job we are currently in the midst of slowly migrating our Windows 2003 environment to a Windows 2008 environment. One of the first places we are moving to 2008 is our file and print servers in the field. Lucky me, I got to configure the first Windows 2008 file and print server in one of our nearby remote offices!

One of the first things I needed to do after installing the server and powering it on was to migrate all of the printers from the old 2003 server in that office over to the new 2008 server that was replacing it. In Windows 2000 and Windows 2003 I would have used the Printmig utility, but that has gone by the wayside in 2008. Now it is even easier to migrate printers!

So how about it? Do you want to know how to do it? Easy!

  1. Click on Server Manager
  2. Scroll down to Roles Summary and click on Add Roles
  3. Check the box for Print Services and follow the wizard to install
  4. After it is installed, Click Start > Administrative Tools > Print Management
  5. Expand the tree, then right click on Print Servers and click on Add/Remove Servers
  6. Enter the name of the print server you want to migrate the printers off of and select Add to List, then click Apply
  7. Right click on that print server in the tree and select Export Printers to a File and save the file on your desktop
  8. After the export is complete, right click on your new server in the tree
  9. Select Import Printer from a File, and select the export from your desktop
  10. Follow the wizard and you are done!

It couldn't be easier! The old Printmig was similar, but this feature is built right in! No downloading, no fussing! Just a simple export, import and boom! DONE!

One thing you have to note though is that, according to Technet, you cannot migrate printers from 2000 servers or older. For that you will have to use Printmig to a Windows 2003 server first, then on over to a Windows 2008 server.
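For what it's worth, the export/import in steps 7-9 can also be scripted with the Printbrm.exe tool that ships with Windows Server 2008 (under %WINDIR%\System32\Spool\Tools). This is a dry-run sketch — the server names are placeholders, and you'd remove the echo prefixes and run it from an elevated prompt to do it for real:

```shell
# Dry run: command-line equivalent of the export/import steps above.
# \\oldserver and \\newserver are placeholder names.
src='\\oldserver'
dst='\\newserver'
file='printers.printerExport'
echo printbrm -s "$src" -b -f "$file"   # back up (export) the old server's printers
echo printbrm -s "$dst" -r -f "$file"   # restore (import) them on the new server
```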

How many of you have migrations like this coming up? How many of you have already migrated? Let us know in the comments!



Apr 20, 2009

Free Secure Instant Messaging For Your Business!

The company I work for has a no instant messaging policy. More precisely, the policy says we are not allowed to use outside instant messaging services like AIM, MSN, GTalk etc. That makes sense right? I mean there is a very real security concern with using public messaging services for business use.

It turns out that our Finance department has been begging and pleading with my boss for the use of IM for collaboration between their team, but he has been holding them at bay because of our IM policy. That is, until I saw a recent episode of Hak5. In Episode 508, Matt Lestock introduced everyone to Openfire, created by Ignite Realtime, which according to their site is:

…a real time collaboration (RTC) server licensed under the Open Source GPL. It uses the only widely adopted open protocol for instant messaging, XMPP (also called Jabber). Openfire is incredibly easy to setup and administer, but offers rock-solid security and performance.

Here is the segment of Matt Lestock talking about Openfire on Hak5:

You can catch the full episode here: (Hak5 Episode 508)

As you can see in the video, it is really easy to set up. In the video, Matt uses the embedded database. I played with that myself, and I have to say it sucks! I set mine up with MySQL and it runs much better. Also, he set his up on CentOS without Active Directory support. I, on the other hand, set mine up on 64-bit Ubuntu 9.04 with LDAP support so my company’s users can all log in without me having to set up 300 or so user accounts!

Installation is incredibly easy on Ubuntu 9.04, especially if you are starting from a brand new installation of Ubuntu Server, which is what I did, because you need to have Ubuntu set up as a LAMP server. As many of you already know, you can select the option of setting up a LAMP server in Ubuntu Server at install time by simply checking the box (see below).

[Screenshot: Ubuntu Server package selection with “LAMP server” checked]

Here is what I did after I was done installing Ubuntu:

  1. Install phpmyadmin for easy MySQL administration

    sudo apt-get install libapache2-mod-auth-mysql php5-mysql phpmyadmin
  2. Edit the php.ini file to make sure it works correctly with MySQL

    sudo nano /etc/php5/apache2/php.ini

    Add the following line to the end: extension=mysql.so
  3. Restart Apache

    sudo /etc/init.d/apache2 restart
  4. Install Java

    sudo apt-get install sun-java6-bin
  5. Create your MySQL database

    Browse to http://servername/phpmyadmin

    Login with root and the password you configured during the MySQL installation

    On the main page, locate Privileges and scroll down to Add a new User

    Use the following on the New User screen:
    Username = Enter a username, I used 'openfire'
    Host = From the drop down menu, select localhost
    Password = Enter a password, retype your password

    Under the Database for user section of that page, click on the radio button for Create database with same name and grant all privileges.

    At the bottom of the page, click on the Go button.
  6. Download Openfire

    wget http://www.igniterealtime.org/downloadServlet?filename=openfire/openfire_3.6.3_all.deb
  7. Install Openfire

    sudo dpkg -i openfire_3.6.3_all.deb

Bam! Now all you have to do is browse to http://servername:9090 and run the setup wizard like Matt Lestock does in the video, with the exception of the MySQL database part. For that you just enter the database name and the database user info that we set up in step 5. Easy!
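If you prefer a terminal over phpMyAdmin, the clicks in step 5 boil down to a few SQL statements run as the MySQL root user (the user and database names match the examples above; the password is a placeholder — use your own):

```sql
-- Run these from: mysql -u root -p
CREATE DATABASE openfire;
CREATE USER 'openfire'@'localhost' IDENTIFIED BY 'YourPasswordHere';
GRANT ALL PRIVILEGES ON openfire.* TO 'openfire'@'localhost';
FLUSH PRIVILEGES;
```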

Now that this puppy is all set up, people can stop wasting space on the mail servers with lengthy email chains. They can collaborate more easily, even across the country, you name it! Also, chat sessions are secured with TLS encryption. Not to mention you can add custom filters to prevent leaks of sensitive information.

One thing it also can do, which we have not implemented, is act as an IM gateway to talk with other IM services on the internet. That might be fun for personal use, but I am not comfortable with it in the office.

If you want to try it out, community support is available on the Openfire website here: (Openfire support)

Happy collaborating!

Apr 3, 2009

Yet Another Defrag Utility!

Here at Bauer-Power we believe in keeping machines running at peak performance. At the forefront of peak performance, in Windows anyway, is keeping your hard drive defragmented. Think about a fragmented hard drive like your desk at work. When everything is nice, neat, and in the right place it is easy to find what you are looking for right? When your desk is messy, with random junk scattered around it makes things harder to find.

The same goes for hard drives. When your hard drive is heavily fragmented, it is like that messy desk: it takes a few extra seconds to find all of the scattered parts of each file you want to work with. When everything is nicely defragmented, everything is put away in nice orderly blocks, and it takes less time to find everything the computer needs to run.

We have recommended a few defraggers in the past. Namely Dirms, JKDefrag, and Defraggler. So what's one more right? I heard about this new one on Tekzilla. It is called Smart Defrag. Here is a brief description from IObit, the makers of Smart Defrag:

Disk fragmentation is generally main cause of slow and unstable computer performance. Smart Defrag helps defragment your hard drive most efficiently. Smart Defrag not only defragments computer deeply but optimizes disk performance. With 'install it and forget it' feature, Smart Defrag works automatically and quietly in the background on your PC, keeping your hard disk running at its speediest. Smart Defrag is completely free for home, organization, and business.




Smart Defrag works on all versions of Windows, including all server editions. This is good news, because keeping your servers running smoothly is just as important as, if not more important than, keeping your workstations running smoothly. Also, server disk defragmenters can be expensive. Why not save a few extra bucks with this and put the money to better use... like upgrading RAM ;-P

What defragmenters do you use? Do you tend to stick to the built in defrag utility? Would you trust a free defrag utility on production servers? Why or why not? Let me know in the comments.



Mar 31, 2009

How To Install VMWare Server 2 on Ubuntu 9.04 Server

I just finished installing Ubuntu 9.04 Server and VMWare Server 2, and I couldn't be more pleased with this pair! The last time I wrote about installing VMWare Server was back on Ubuntu 8.04. With that, you had to install VMWare Server using the VMWare-any-any patch. I can tell you that this is NOT the case with VMWare Server 2 on Ubuntu 9.04!

No, I guess the VMWare team finally decided to make an Ubuntu-friendly version of their software and made installation relatively easy! Also, they have done away with the VMWare Server console and gone with a very easy-to-use web console! That means less headache for you and me!

So let's get to the nitty gritty. This is what you need to do to install VMWare Server 2 on Ubuntu 9.04:


Note: Since Ubuntu Server doesn't have a GUI, I had to register for the download, then download it to my Windows machine. After that I transferred the VMWare Server tarball over to a Samba share on my Ubuntu 9.04 server. Feel free to get the tarball onto your server any way you see fit ;-)

Now, to the install steps:

  1. Install the necessary prerequisites

    sudo apt-get install linux-headers-`uname -r` build-essential xinetd

  2. Change into the directory where you saved your VMWare tarball. I saved mine in /home/paul/vm

    cd /home/paul/vm

  3. Extract the tarball and run the installer

    tar xvfz VMware-server-*.tar.gz

    cd vmware-server-distrib

    sudo ./vmware-install.pl

  4. You can hit Enter for all of the defaults, except when asked for the name of an alternate administrator. For that, enter your username. If you don't, you will have to reset the root password, as root is the default administrator for VMWare Server. Also, when asked for the directory for virtual machines, you have the option of saving them somewhere else. I, for instance, like to save mine in /home/paul/vm.
  5. Near the end, you will have to enter the serial number you received when you first registered for the download.
  6. Done! Now you are ready to log in! You can reach the web console by browsing to http://servername:8222




After building your first machine, you can view it through the web console in IE and Firefox after installing the correct plugins which you will be prompted for in the web console.

That is it! Seriously! Can you believe it? In just about all previous Ubuntu versions, and VMWare Server versions it has been 1.5 bitches to install. This time it is really really easy!

After installing it, let me know what you think in the comments!

[EDIT 6/2/09] - A lot of you have been experiencing the vsock error. I have not experienced this, but for those of you who have, I found a Perl script that should resolve the problem. You can download it here: (VSOCK FIX)

To run it open a terminal, and do the following:

  1. cd into the directory where the Perl script is

    >cd /path/to/vmware-config.pl

  2. Make the script executable

    >chmod +x vmware-config.pl

  3. Patch!

    >sudo patch /usr/bin/vmware-config.pl /path/to/vmware-config.pl

Let me know if that works. If it doesn't, and you found a better solution, let me know that too!


Dec 9, 2008

How To Setup A REALLY Fast Dedicated Torrenting Server

Have you ever wanted a super fast torrent server in order to more quickly download and seed those Ubuntu ISOs? If you answered with a super excited "More Tom Cruise movies!" then this article may, sadly, be for you. I'm not going to do a lot of hand-holding in this article; I mainly want to show you what is possible with the power of Linux and some disposable income.

What you are going to need:
  • Some basic Linux command line know-how, or the ability to do some Googling and learning.
  • Disposable income - anywhere from 40-60 dollars a month.

The Why

You may be wondering why you need money to follow this tutorial, and the answer is that ideally you are going to need either a dedicated or virtualized server in a datacenter running Linux that you have full root access to. Why the need for a datacenter? Well, if you really want this to be a super fast torrent server you are going to need a super big internet connection. Super big/massive/ginormous/huge internet connections are found in datacenters. Hence the need for some extra monthly cash.

If you are thinking ahead a bit, you may be wondering how having a server download your torrent is going to help. After all, once the file(s) are downloaded to the server, you will still need to download them to your home PC. Well, here's the deal: most residential internet connections are heavy on download capability but measly on the upload side. That means that even though you may have a 10 meg download, with a measly 1 meg upload, torrents that do not have many seeders relative to downloaders (which is a lot of them) will be much slower. If in contrast you had a 100 Mb NIC that you could use to capacity... well, then you could upload a TON and in return get a lot more downloaded to the server in an extremely short amount of time. I have had my own dedicated torrent server set up and running for about 2 months now, and it has cut the time it takes me to get a large download in half. The other side benefit of having a dedicated box for torrents is that you no longer need to keep your desktop up and running throughout the night, and you can continue to seed without sucking up your very finite amount of bandwidth at home.
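To put rough numbers on what a datacenter pipe buys you, here is the back-of-the-envelope math for how long a saturated 100 Mbit/s link takes to move 750 GB (a typical monthly allowance on these plans). Protocol overhead is ignored, so treat it as a best case:

```shell
# Back-of-the-envelope: 750 GB through a saturated 100 Mbit/s link
bytes=$((750 * 1024 * 1024 * 1024))   # 750 GB in bytes
rate=$((100 * 1000 * 1000 / 8))       # 100 Mbit/s = 12,500,000 bytes/s
seconds=$((bytes / rate))
echo "$((seconds / 3600)) hours"      # roughly 17 hours of flat-out transfer
```

In other words, a single day of sustained transfer could burn through the whole month's allowance — plenty of headroom for bursty torrent traffic.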

The Hardware

Alright, now that you have bought into the idea, exactly where should you go to rent a dedicated server? I have had a great experience with The Planet, and they are the ones that currently host my torrent server. I am paying a mere $40 a month for my dedicated Celeron box with a gig of RAM, and that includes 750 GB of bandwidth every month. That is a REAL 750 GB of bandwidth a month; these guys aren't kidding around with you. Basically, I can download/upload as much as my 100 Mb NIC can handle... it is pretty impressive to see 20 megs of data coming in and out of the box every second. Now, I didn't get that server for $40 a month by looking at their main server offerings, I looked in their bargain bin. Your torrent server doesn't need much as far as processor/memory goes, so picking up some older hardware cheaply is a great way to go. Think of the bargain bin as a woot-off for servers: check the page a couple of times a day, and when you see one at a price you like, have that credit card ready. They get sold really fast; in fact, I didn't get my first choice because I was a little slow entering in all my information.

Note: I'd highly recommend paying the extra $10 a month for the 100 Mb NIC instead of the 10 Mb one; this will allow you to really utilize the bandwidth available to you.

Once you have put your order in, you should have a server that is all set up and ready for you to SSH into (use PuTTY if running Windows) within 3 days.

The Software

Now the fun part; we are getting so close to torrent nirvana I can smell it! If you ordered from The Planet you will probably have a CentOS box. If so, follow the instructions here to install rtorrent.

If you are running Ubuntu try the instructions here.

Note: Make sure you install "screen". This is absolutely necessary if you want to leave rtorrent up and running; there are instructions on how to install and use it in the CentOS instructions.

At this point you have an extremely lean and fast torrent application, rtorrent. But what the heck do you do with it? It is just a mostly black screen with some text on it and no instructions on how to get it to do anything. Well my friend, check out the nice command list on the official rtorrent user guide.

This is where rtorrent gets really awesome and flexible, you can set it up so that it will actually monitor a directory for torrent files and automatically start downloading using the parameters that you set in the rtorrent configuration file. Where the rtorrent.rc file is located is going to depend on where you installed the application, so find it and open it with your text editor of choice (try nano if you are new to Linux). This is where you can tell rtorrent what directory to monitor for new torrents, how fast to download/upload, what ratio to stop at and much more. The great thing about having rtorrent monitor a particular directory is that you can then FTP to that directory and just upload a torrent from your PC to the server and it will automatically start downloading the torrent. Once it is done you can FTP to the directory where you told rtorrent to save the files and start downloading that new Tom Cruise movie to your desktop :).
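For reference, a minimal rtorrent.rc along the lines described above might look like the following. The paths and values are examples only, and option names should be double-checked against the rtorrent user guide for your version:

```
# ~/.rtorrent.rc -- minimal example (all paths are placeholders)
directory = /home/youruser/downloads           # where torrent data is saved
session   = /home/youruser/.rtorrent-session   # lets rtorrent resume after a restart

# Watch a directory: any .torrent uploaded here via FTP starts automatically
schedule = watch_directory,5,5,load_start=/home/youruser/watch/*.torrent

port_range    = 6890-6999   # listening ports
upload_rate   = 0           # 0 = unlimited
download_rate = 0           # 0 = unlimited
```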

Conclusion

There is even more sweet stuff we could do at this point, but I haven't gotten around to it yet. But my idea is that you setup some sort of streaming video service that you can then point your PC to in order to just stream the movie in high quality, this way you wouldn't have to download the whole file in order to start watching. If you did this it would literally be possible to start downloading a popular movie (some open source/free one of course) and be streaming it in 5-10 minutes. Awesome.

If you have any comments/suggestions for improvement please let me know in the comments. Specifically, if you know of any other good choices for cheap dedicated servers I'd love to hear about them.

Thanks for giving me the opportunity to write on Bauer-Power El Di Pablo!

About the author:
Rob Steenwyk is the owner/author of Bud Boy Tech, a blog focusing on technology and geekiness. He'd really appreciate it if you added the Bud Boy Tech RSS feed to your feed reader : )

Dec 21, 2007

Know More About Load Balancing

Load balancing, by definition, is the process of spreading the work done by a computer system across a number of different computer systems to increase the speed at which the work is completed. There are several different methods by which load balancing can be accomplished, and the technique can use many different types of computer components, including both hardware and software. Load balancing is typically done using a cluster of servers that may or may not be located in the same place. Some load balancers provide a mechanism for doing something special in the event that all backend servers are unavailable, such as forwarding to a backup load balancer or displaying a message regarding the outage. Load balancing can also be useful when dealing with redundant communications links.


Many companies see the benefits of load balancing and implement it. Companies that conduct large numbers of business transactions over the internet are prime candidates, as load balancing helps ensure that all of their clients and customers can complete their transactions quickly and accurately. Companies that need to network a great many computers for individual users also typically use load balancing to ensure that all computers work properly and have enough capacity to perform their intended functions. It also ensures that the company can still do business if one server becomes corrupted or goes down for an extended period of time.


Several methods are in wide use for load balancing. One of the most popular is Global Server Load Balancing. This technique distributes incoming tasks to a group of servers in a particular geographic location. It is widely used by companies that have a global presence and need to serve customers or employees in many different regions. Global Server Load Balancing spreads the workload across the entire server system in an easy-to-manage way and ensures that each geographic location gets the correct information from the correct set of servers.


Another commonly used technique is called Persistence Load Balancing. This technique assigns each new client to a different server in round-robin fashion (for example, distributing page requests evenly across three Squid cache servers). The client is then assigned to that specific server for the future of its relationship with the business. This ensures that no one server is overloaded with a particular type of client, such as those in a certain geographical area or those using a specific type of service, and that clients are distributed evenly throughout all of the servers the business possesses. These server assignments are typically tracked by using the customer's IP address as the customer's unique identifier.
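The IP-based persistence described above can be sketched in a few lines: hash the client's address to a stable number and use it to pick the same backend every time. This is a toy illustration only — real load balancers also track backend health, weights, and failover:

```shell
# Toy persistence: map a client IP to one of three backends, deterministically
ip="192.168.1.42"                       # example client address
servers=(web1 web2 web3)
hash=$(printf '%s' "$ip" | cksum | cut -d' ' -f1)   # stable checksum of the IP
idx=$(( hash % ${#servers[@]} ))
echo "client $ip -> ${servers[$idx]}"   # the same IP always lands on the same backend
```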



By Amy Nutt

About the Author:

Managed IT services can include: Managed Hosting and Infrastructure, Application Services, Disaster Recovery and Professional Services. See the benefits of using load balancing and implement the procedure for your clients.

Article Source: www.articlesbase.com


