Monday, November 26, 2012

Monitoring System Performance Server 2008

Windows Server 2008 : Monitoring System Performance (part 1) - Key Elements to Monitor for Bottlenecks




Capacity analysis is not about how much information you can collect; it is about collecting the appropriate system health indicators and the right amount of information. Without a doubt, you can capture and monitor an overwhelming amount of information from performance counters. There are more than 1,000 counters, so you’ll want to carefully choose what to monitor. Otherwise, you might collect so much information that the data will be hard to manage and difficult to decipher. Keep in mind that more is not necessarily better with regard to capacity analysis. This process is more about efficiency. Therefore, you need to tailor your capacity-analysis monitoring as accurately as possible to how the server is configured.
Every Windows Server 2008 R2 server has a common set of resources that can affect performance, reliability, stability, and availability. For this reason, it’s important that you monitor this common set of resources, namely CPU, memory, disk, and network utilization.
In addition to the common set of resources, the functions that the Windows Server 2008 R2 server performs can influence what you should consider monitoring. So, for example, you would monitor certain aspects of system performance on file servers differently than you would for a domain controller running on Windows Server 2008 R2 AD DS. There are many functional roles (such as file and print sharing, application sharing, database functions, web server duties, domain controller roles, and more) that Windows Server 2008 R2 can perform, and it is important to understand all those roles that pertain to each server system. By identifying these functions and monitoring them along with the common set of resources, you gain much greater control and understanding of the system.
The following sections go more in depth on what specific items you should monitor for the different components that constitute the common set of resources. It’s important to realize, though, that there are several other items that should be considered regarding monitoring in addition to the ones described in this article. You should consider the following material a baseline of the minimum number of things to begin your capacity-analysis and performance-optimization procedures.

Key Elements to Monitor for Bottlenecks

As mentioned, four resources compose the common set of resources: memory and pagefile usage, processor, disk subsystem, and network subsystem. They are also the most common contributors to performance bottlenecks. A bottleneck can be defined in two ways. The most common perception of a bottleneck is that it is the slowest part of your system, and it can be either hardware or software. When a resource is overburdened or simply not equipped to handle higher workload capacities, the system might experience a slowdown in performance. For any system, the slowest component is, by definition, the bottleneck. For example, a web server might be equipped with ample RAM, disk space, and a high-speed network interface card (NIC), but if the disk subsystem has older drives that are relatively slow, the web server might not be able to handle requests effectively. The bottleneck (that is, the antiquated disk subsystem) drags the other resources down.
A less common, but equally important, form of bottleneck is one where a system has significantly more RAM, processors, or other resources than the application requires. In these cases, the system creates extremely large pagefiles and has to manage very large disk and memory allocations, yet never uses them. When an application needs to access memory, processors, or disks, the system might be busy managing the idle resources, creating an unnecessary bottleneck caused by having too many resources allocated. Thus, performance optimization means not having too few resources, but also not having too many allocated to a system.

Monitoring System Memory and Pagefile Usage

Available system memory is usually the most common source of performance problems on a system. The reason is simply that too little memory is often installed on a Windows Server 2008 R2 system, and Windows Server 2008 R2 tends to consume a lot of memory. Fortunately, the easiest and most economical way to resolve the performance issue is to configure the system with additional memory. This can significantly boost performance and improve reliability.
There are many significant counters in the memory object that can help determine system memory requirements. Most network environments shouldn’t need to monitor every single counter consistently to get an accurate representation of performance. For long-term monitoring, two very important counters can give you a fairly accurate picture of memory pressure: Page Faults/sec and Pages/sec. These two memory counters alone can indicate whether the system is properly configured and experiencing memory pressure. Table 1 outlines the counters necessary to monitor memory and pagefile usage, along with a description of each.
Table 1. Important Counters and Descriptions Related to Memory Behavior
Memory\Committed Bytes: Monitors how much memory (in bytes) has been committed by processes. As this number grows beyond the available RAM, paging increases and the pagefile grows along with it.
Memory\Pages/sec: Displays the number of pages read from or written to disk per second.
Memory\Pages Output/sec: Displays virtual memory pages written to the pagefile per second. Monitor this counter to identify paging as a bottleneck.
Memory\Page Faults/sec: Reports both soft and hard faults per second.
Process\Working Set, _Total: Displays the amount of memory committed to the running processes that is actually in use.
Paging File\% Usage: Reports the percentage of the paging file that is actually in use. Use this counter to determine whether the Windows pagefile is a potential bottleneck. If it remains above 50% or 75% consistently, consider increasing the pagefile size or moving the pagefile to a different disk.
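For ad hoc sampling, these counters can also be captured from the command line with typeperf (for example, typeperf "\Memory\Pages/sec" -sc 5 -f CSV -o mem.csv), which writes CSV output. As a rough illustration only - the column layout is assumed from typeperf's usual format, and the helper below is hypothetical - here is a short Python sketch that averages such a sample:

```python
import csv
import io

def average_counter(csv_text):
    """Average the first counter column of a typeperf-style CSV sample.

    Assumes the layout typeperf produces: a header row, then one row per
    sample with the timestamp in column 0 and the counter value in column 1.
    """
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)                      # skip the header row
    values = [float(row[1]) for row in reader if row and row[1]]
    return sum(values) / len(values) if values else 0.0

# Hypothetical sample of \Memory\Pages/sec collected with:
#   typeperf "\Memory\Pages/sec" -sc 3 -f CSV -o mem.csv
sample = '''"(PDH-CSV 4.0)","\\\\SERVER01\\Memory\\Pages/sec"
"11/26/2012 09:00:01","4.0"
"11/26/2012 09:00:02","22.0"
"11/26/2012 09:00:03","10.0"
'''
print(average_counter(sample))  # → 12.0
```

Averaging over a window like this matters because the thresholds discussed in this article refer to sustained values, not momentary spikes.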

By default, the Memory tab in Resource Monitor, shown in Figure 1, provides a good high-level view of current memory activity. For more advanced monitoring of memory and pagefile activity, use the Performance Monitor snap-in.
Figure 1. Memory section of the Resource Monitor.

Systems experience page faults when a process requires code or data that it can’t find in its working set. A working set is the amount of memory that is committed to a particular process. When this happens, the process has to retrieve the code or data from another part of physical memory (referred to as a soft fault) or, in the worst case, from the disk subsystem (a hard fault). Systems today can handle a large number of soft faults without significant performance hits. However, because hard faults require disk subsystem access, they can cause the process to wait significantly, which can drag performance to a crawl. Memory access is orders of magnitude faster than disk access, even with the fastest hard drives available. The Memory section of the Resource Monitor in Performance Monitor includes columns that display working sets and hard faults by default.
The Page Faults/sec counter reports both soft and hard faults. It’s not uncommon to see this counter displaying rather large numbers. Depending on the workload placed on the system, this counter can display several hundred faults per second. When it gets beyond several hundred page faults per second for long durations, you should begin checking other memory counters to identify whether a bottleneck exists.
Probably the most important memory counter is Pages/sec. It reveals the number of pages read from or written to disk and is, therefore, a direct representation of the number of hard page faults the system is experiencing. Microsoft recommends upgrading the amount of memory in systems that are seeing Pages/sec values consistently averaging above 5 pages per second. In actuality, you’ll begin noticing slower performance when this value is consistently higher than 20. So, it’s important to carefully watch this counter as it nudges higher than 10 pages per second.
Note
The Pages/sec counter is also particularly useful in determining whether a system is thrashing. Thrashing is a term used to describe systems experiencing more than 100 pages per second. Thrashing should never be allowed to occur on Windows Server 2008 R2 systems because the reliance on the disk subsystem to resolve memory faults greatly affects how efficiently the system can sustain workloads.
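The rough bands above (upgrade consideration above 5 pages per second, noticeable slowdown above 20, thrashing above 100) can be folded into a small triage helper. This is a hypothetical sketch - the function name and labels are illustrative, not part of any Windows API:

```python
def memory_pressure(pages_per_sec):
    """Classify a sustained \\Memory\\Pages/sec average using the rough
    bands described above (5 / 20 / 100 pages per second)."""
    if pages_per_sec > 100:
        return "thrashing"   # disk-bound memory faulting; add RAM
    if pages_per_sec > 20:
        return "degraded"    # users will notice slower performance
    if pages_per_sec > 5:
        return "watch"       # above Microsoft's upgrade threshold
    return "ok"

for reading in (3, 12, 45, 250):
    print(reading, memory_pressure(reading))
```

The point of the bands is that a single sample above 20 means little; it is the sustained average that should drive an upgrade decision.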

System memory (RAM) is limited in size and Windows supplements the use of RAM with virtual memory, which is not as limited. Windows will begin paging to disk when all RAM is being consumed, which, in turn, frees RAM for new applications and processes. Virtual memory resides in the pagefile.sys file, which is usually located in the root of the system drive. Each disk can contain a pagefile. The location and size of the pagefile is configured under the Virtual Memory section, shown in Figure 2.
Figure 2. Virtual Memory configuration options.

To access the Performance Options window, do the following:
1.
Click Start.
2.
Right-click on Computer and select Properties.
3.
Click on the Advanced System Settings link on the left.
4.
When the System Properties window opens, click the Settings button under the Performance section.
5.
Select the Advanced tab.
6.
Click Change under Virtual Memory.
Tip
Windows will normally handle and increase the size of pagefile.sys automatically as needed; however, in some cases you might want to manage virtual memory settings yourself to improve performance. Keeping the default pagefile on the system drive and adding a second pagefile to another hard disk can significantly improve performance.
Spanning virtual memory across multiple disks, or simply placing pagefile.sys on another, less-used disk, will also allow Windows to run faster. Just ensure that the other disk isn’t slower than the disk pagefile.sys is currently on. The more physical memory a system has, the more virtual memory Windows will allocate.
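For sizing the pagefile manually, a long-standing rule of thumb starts the initial size at roughly 1.5 times physical RAM. This is a convention only, not a Microsoft-mandated value, and letting Windows manage the size automatically is usually fine; the sketch below is purely illustrative:

```python
def suggested_pagefile_mb(ram_mb, factor=1.5):
    """Classic 1.5x-RAM rule of thumb for an initial pagefile size.
    The multiplier is a convention, not a mandated value."""
    return int(ram_mb * factor)

# e.g. a server with 8 GB of RAM
print(suggested_pagefile_mb(8192))  # → 12288
```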

SharePoint Calculated Fields Functions and Formulas

I was trying to figure out what I have available to me as far as Calculated Fields within SharePoint and came across this good article. It looks like Excel-style formulas are basically what work:

http://www.sharepointpanda.com/2009/04/sharepoint-calculated-field-functions-and-formulas/




Monday, November 19, 2012

Active Directory Exploration

I use the following tools for AD Exploration:

Active Directory Explorer

Active Directory Users and Groups

cmd line - net user lanID /domain

Saturday, November 17, 2012

Using RoboCopy for backups - use this a ton, just documenting (-:


There are many paid and free software solutions available to back up critical data and files on a computer system. Many users, however, are unaware of a built-in Windows 7 command called ROBOCOPY (Robust File Copy) that allows users to create simple or highly advanced backup strategies.

In its simplest form, ROBOCOPY can be likened to XCOPY on steroids. Some of the more important features that make ROBOCOPY an attractive backup alternative are:
  • multi-threaded copying
  • mirroring or synchronisation mode between the destination and source
  • automatic retry and copy resumption
The examples shown below are primarily geared towards less experienced users, especially those that are unfamiliar with batch file creation and running. More experienced users, however, are welcome to explore some of the advanced functionality offered by ROBOCOPY here:

http://technet.microsoft.com/en-us/library/cc733145(WS.10).aspx

and also here:

http://www.theether.net/download/Mic...s/robocopy.pdf

or by simply typing robocopy /? at a cmd window prompt.


Note
ROBOCOPY is a FOLDER copier, not a FILE copier - the source and destination syntax arguments in ROBOCOPY can only be folder names.


 Creating a BACKUP strategy

The easiest way to use the ROBOCOPY command to create a backup strategy is by creating a batch (.BAT) file. A batch file is a simple text file, that when executed, runs several commands one after the other.

Step 1

Click Start, and in the search box, type notepad. Select Notepad to open up a new blank Notepad document.


Step 2
Type the ROBOCOPY commands into the Notepad document, save it as a .BAT file, and then execute it.

In the example below, I have 3 folders (Data1, Data2, and Data3) containing some data that I wish to backup. One folder is located on E: drive and the other two are located on F: drive. I wish to back these up as follows:

Data1 folder on E: backup to a folder called Backups on G: (external USB drive)
Data2 folder on F: backup to a folder called Backups on G: (external USB drive)
Data3 folder on F: backup to a folder called Backups on Q: (network storage drive)

The general format of the ROBOCOPY command is:

Code:
 
robocopy <source> <destination> <options>
In the empty Notepad document, the simplest form of the command would look like this:

Code:
 
robocopy E:\Data1 G:\Backups\Data1
robocopy F:\Data2 G:\Backups\Data2
robocopy F:\Data3 Q:\Backups\Data3
pause
Tip
If source or destination paths contain spaces in them, enclose these in double quotes e.g. "C:\My Data\My Music"


Only the source
  • E:\Data1
  • F:\Data2
  • F:\Data3
and the destination
  • G:\Backups\Data1
  • G:\Backups\Data2
  • Q:\Backups\Data3
are mandatory inputs into the ROBOCOPY command.

Tip
The PAUSE command at the bottom of the .BAT file allows the cmd window to stay open after it has completed to allow me to see the output from ROBOCOPY.


If I save the .BAT file to my Desktop, and run it by double-clicking it, then a cmd window is opened and the .BAT file executes the three ROBOCOPY commands as shown below.

(Screenshot: the cmd window output of the three ROBOCOPY commands.)

The same information is repeated for every ROBOCOPY line in the .BAT file.



In order to utilise some of the powerful functionality in ROBOCOPY, I need to utilise some options in the ROBOCOPY command line. In this next example I want to edit my existing backup strategy such that:
  1. All sub-folders within my data folders are backed up, even if they are empty.
  2. The backup only copies newer files to my existing backup - this means a faster backup time.
  3. The percentage progress counter for copying is not shown - this neatens the overall appearance of the ROBOCOPY information, and creates a smaller log file.
  4. The information normally echoed to the cmd window is saved to a log file that I can examine at a later stage.
In order to do this, I need to specify some additional options in my ROBOCOPY commands like this:

Code:
 
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /log+:backup_log.txt
pause
Where:
/e = copy all sub-folders, even empty ones
/mir = mirror the source to the destination (copies new and changed files, and deletes destination files that no longer exist in the source)
/np = no progress counter
/log: = create a log file

Tip

Note the use of the /log+: option in the 2nd and 3rd line of the .BAT file. This option ensures that the results of the 2nd and 3rd ROBOCOPY are appended to the log file created in the 1st ROBOCOPY line, meaning I only need one log file to capture all the information I backup.

The log file is always saved to the same folder as the .BAT file - in my case, the folder is saved to my Desktop.


Warning
Use the /MIR option with caution - it deletes files from the destination that no longer exist in the source.

This means that if a file/folder is deleted from the source (deliberately or accidentally) and the backup is then run, ROBOCOPY purges the matching file/folder from the destination too - the backup copy is lost as well. Deleting a file from the destination, by contrast, is harmless: the next mirror simply copies it across from the source again. To avoid surprises, treat the source as the master copy, make deletions there deliberately, and double-check the source path before running a mirror - an empty or mistyped source will empty the destination.

If in doubt, do not use the /MIR option, but be aware that files deleted from the source will then accumulate in the destination.
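The mirror behaviour is easy to reason about as two set differences. Here is a hypothetical Python model of the decisions /MIR makes (a real run also recopies changed files; timestamps and sizes are ignored in this sketch):

```python
def mirror_plan(source_files, dest_files):
    """Model of /MIR's decisions: copy what the destination is missing,
    purge what the source no longer has."""
    source, dest = set(source_files), set(dest_files)
    return {"copy": sorted(source - dest), "purge": sorted(dest - source)}

# Deleting report.doc from the SOURCE means the next mirror purges it
# from the backup as well -- the behaviour the warning describes.
plan = mirror_plan(["a.txt"], ["a.txt", "report.doc"])
print(plan)  # → {'copy': [], 'purge': ['report.doc']}
```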


Since the output from ROBOCOPY is written to the log file, the cmd window will not display the output from ROBOCOPY. If I wish to have this information written to both the log file and the cmd window for visual tracking of the backup process, then I can add the /tee option to each line in the .BAT file, as shown below.

Code:
 
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /tee /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /tee /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /tee /log+:backup_log.txt
pause
This is an example of the typical output to the log file - it looks exactly the same as what is echoed to the cmd window.

Code:
 
-------------------------------------------------------------------------------
   ROBOCOPY     ::     Robust File Copy for Windows                              
-------------------------------------------------------------------------------
  Started : Sun Sep 18 23:35:01 2011
   Source : E:\Data1\
     Dest : G:\Backups\Data1\
    Files : *.*
 
  Options : *.* /S /E /COPY:DAT /PURGE /MIR /R:1000000 /W:30 
------------------------------------------------------------------------------
                    2 E:\Data1\
------------------------------------------------------------------------------
               Total    Copied   Skipped  Mismatch    FAILED    Extras
    Dirs :         1         0         1         0         0         0
   Files :         2         0         2         0         0         0
   Bytes :   442.1 k         0   442.1 k         0         0         0
   Times :   0:00:00   0:00:00                       0:00:00   0:00:00
   Ended : Sun Sep 18 23:35:01 2011
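When several runs append to one log, the summary blocks can be scanned for failures automatically. A hypothetical Python sketch that reads the Files row, assuming the column layout shown in the sample above (Total, Copied, Skipped, Mismatch, FAILED, Extras):

```python
def files_summary(log_text):
    """Pull totals from the 'Files :' row of a ROBOCOPY summary block.
    The 'Files : *.*' header line is ignored because its columns are
    not all numeric."""
    for line in log_text.splitlines():
        if not line.strip().startswith("Files :"):
            continue
        cols = line.split(":", 1)[1].split()
        if len(cols) == 6 and all(c.isdigit() for c in cols):
            total, copied, skipped, _mismatch, failed, _extras = map(int, cols)
            return {"total": total, "copied": copied,
                    "skipped": skipped, "failed": failed}
    return None

# The summary row from the sample log above:
row = "   Files :         2         0         2         0         0         0"
print(files_summary(row))  # → {'total': 2, 'copied': 0, 'skipped': 2, 'failed': 0}
```

A non-zero "failed" count in any block is the thing worth alerting on.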
Since one of the data folders I am backing up is being copied across a network, I want to ensure that any possible network outages do not cause some critical files to be skipped in the backup. To do this, I can make use of the /z option in the 3rd line of my .BAT file (backup to my network storage) as shown below.

Code:
 
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /tee /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /tee /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /z /tee /log+:backup_log.txt
This option implements a "restartable" copy. If I were part way through the copying process and lost the network connection, ROBOCOPY would automatically resume the copy at the point of failure once the connection was re-established. By default it retries up to one million times, waiting 30 seconds between attempts (the /R:1000000 /W:30 settings visible in the Options line of the log output above). The only drawback with this option is that it can significantly increase the backup time.

ROBOCOPY also has the ability to perform faster multi-threaded copying by simply using the /mt option. I can choose the number of threads to use by specifying a number between 1 and 128 after the /mt option; if I specify /mt without a number, it will use 8 threads by default. In the example below, I use 4 threads to copy to my USB drive, and 8 threads (no number) to copy to my network drive.

Code:
 
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /tee /mt:4 /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /tee /mt:4 /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /z /tee /mt /log+:backup_log.txt
Below is a template that you can use to create your own backup strategy using ROBOCOPY. Simply copy and paste the lines into a blank text document, and then edit as appropriate.

Code:
rem --- Edit the lines below to create your own backup strategy
rem --- The /mir option has been left out for safety's sake
rem --- Add more lines for each new folder requiring backup
rem --- Specified 4 threads to use for multi-threaded copying
rem --- The results of the backup can be found in my_backup_log.txt 
robocopy <source> <destination> /e /np /tee /mt:4 /log:my_backup_log.txt
robocopy <source> <destination> /e /np /tee /mt:4 /log+:my_backup_log.txt
robocopy <source> <destination> /e /np /tee /mt:4 /log+:my_backup_log.txt
pause
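If you maintain many folder pairs, a template like the one above can also be generated rather than hand-edited. A hypothetical Python sketch (the folder names and options are illustrative, mirroring the examples in this post):

```python
BACKUPS = {
    r"E:\Data1": r"G:\Backups\Data1",
    r"F:\Data2": r"G:\Backups\Data2",
    r"F:\Data3": r"Q:\Backups\Data3",
}
OPTIONS = "/e /np /tee /mt:4"   # /mir left out for safety's sake

def build_backup_bat(backups, options, log="my_backup_log.txt"):
    """Emit one robocopy line per folder pair; /log: on the first line
    creates the log, /log+: on later lines appends to it."""
    lines = []
    for i, (src, dst) in enumerate(backups.items()):
        log_opt = f"/log{'+' if i else ''}:{log}"
        lines.append(f'robocopy "{src}" "{dst}" {options} {log_opt}')
    lines.append("pause")
    return "\n".join(lines)

print(build_backup_bat(BACKUPS, OPTIONS))
```

Paste the printed lines into a .BAT file as described in Step 2. Quoting every path also covers the spaces-in-paths case from the earlier tip.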
Tip

You can automate your backups by using the Windows 7 Task Scheduler to run the .BAT file at specific times.

For more information please refer to this tutorial by Orbital Shark:
Task Scheduler - Create New Task


I hope this brief tutorial helps you access the power and simplicity of ROBOCOPY to create some effective backup strategies.

Regards,
Golden

Friday, November 16, 2012

Galaxy SIII Android Phone - Wireless Network Drops - *#0011#

I recently came into ownership of a sweet Galaxy SIII phone - but noticed that when using the phone as a wifi hotspot it would randomly drop the wifi connection...

Open the service menu through the phone dialer by entering *#0011#, then turn off the wifi power save feature. Problem solved.

Wednesday, November 14, 2012

Remote Blob Storage in SharePoint

Interesting architectural decision point within SharePoint - apparently the advent of SkyDrive in SP 2013 reduces the need for this...

http://blogs.msdn.com/b/opal/archive/2010/03/24/faq-sharepoint-2010-remote-blob-storage-rbs.aspx


Tuesday, November 6, 2012

Copying Content Databases to other SP 2007 Environments.... PART TWO: MISNAMED DATABASE



So when I attached the new database back from TEST into DEV, I misnamed the new database (its name included an app that, I found out later, did not exist in the content database). Now that I am on the straight and narrow path toward enlightenment, I needed to fix the name to be consistent.

Therefore...

1)  stsadm -o deletecontentdb -url (Site Collection URL) -databasename badDatabaseName

2)  SSMS - detach the badDatabaseName

3)  Restore from the badDatabaseName backup into goodDatabaseName (or simply rename badDatabaseName to goodDatabaseName)

4)  stsadm -o addcontentdb -url (Site Collection URL) -databasename goodDatabaseName

Thursday, November 1, 2012

RDP Session Timeout, etc. Server Admin Stuffs as the great Indian used to pluralize.

http://technet.microsoft.com/en-us/library/cc754272.aspx

SCCM vs SCOM

I was educating one of my colleagues on SCCM vs SCOM - just for grins:

http://social.technet.microsoft.com/Forums/en/configmgrgeneral/thread/9776151f-a0be-4d15-8f03-25483bd5d379

http://en.wikipedia.org/wiki/System_Center_Operations_Manager

Oracle User Auditing

http://docs.oracle.com/cd/E11882_01/network.112/e16543/auditing.htm

Copying Content Databases to other SP 2007 Environments....

I'm having the pleasure of working with a very sharp consultant who has been teaching me the ways of SharePoint 2007. I know, I know - old technology, but I am no SP dev, nor have I ever been. Sure, I've built a couple of WebParts and such, but I'm now able to participate in a project that consists of three very well written SP 2007 applications - object models mapped to business classes mapped to lists and much much more...

Yesterday, I learned that one could copy content databases from say a TEST environment back to a DEV environment to migrate data and Site Collections.

Some of the keys I learned:

- If there is a managed path created in the test environment - which there is in my Test environment called "/apps", the same managed path needs to exist in the target SP environment.

- The Site Collections that exist in the TEST environment (/apps/App1, /apps/App2, etc.) must not already exist in the target DEV environment

- To figure out what content database is tied to each Site Collection DO THIS.

- I have specific notes created for this process and will attach them soon, but basically, this site goes through the process of detaching (backup works too) the source database, moving it, restoring it and then using stsadm -addcontentdb to reattach the database to a Web App.  After that, the contained Site Collections within the content db are then created.

http://sharepointsherpa.com/2008/04/18/sharepoint-2007-move-content-and-configuration-databases/

Probably pretty basic for y'all with actual SP Dev experience, but new to me nonetheless.