Blog on the installation process utilizing a Least Privilege approach:
http://www.sharepointassist.com/2012/08/04/sharepoint-2013-lab-build-part-1-getting-started/
GPO info on Server 2012:
http://www.win2012workstation.com/password-restrictions/
Wednesday, December 12, 2012
Dependency injection
"Dependency injection is a software design pattern that allows a choice of component to be made at run-time rather than compile time. This can be used, for example, as a simple way to load plugins dynamically or to choose mock objects in test environments vs. real objects in production environments. This software design pattern injects the depended-on element (object or value etc) to the destination automatically by knowing the requirement of the destination. Another pattern, called dependency lookup, is a regular process and reverse process to dependency injection."
Much to learn...
http://en.wikipedia.org/wiki/Dependency_injection
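The idea is easier to see in code. A minimal Python sketch of constructor injection (the class and method names here are my own invention, not from the article):

```python
# Minimal dependency-injection sketch: the component to use is chosen at
# run time and passed ("injected") into the consumer, instead of the
# consumer constructing a hard-coded dependency itself.

class RealMailer:
    def send(self, msg):
        return f"SMTP: {msg}"      # would talk to a real server

class MockMailer:
    def send(self, msg):
        return f"MOCK: {msg}"      # safe stand-in for test environments

class AlertService:
    def __init__(self, mailer):    # dependency injected via constructor
        self.mailer = mailer

    def alert(self, msg):
        return self.mailer.send(msg)

# Production wires in the real object; tests wire in the mock -- the
# AlertService code itself never changes.
prod = AlertService(RealMailer())
test = AlertService(MockMailer())
```

This is the "choice of component made at run-time" the quote describes; containers like Unity just automate the wiring.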
2010 Flash Crash - Crash of May 6, 2010, 2:45... Algorithms.
http://en.wikipedia.org/wiki/2010_Flash_Crash
The May 6, 2010 Flash Crash[1] also known as The Crash of 2:45, the 2010 Flash Crash or just simply, the Flash Crash, was a United States stock market crash on Thursday May 6, 2010 in which the Dow Jones Industrial Average plunged about 1000 points (about 9%) only to recover those losses within minutes. It was the second largest point swing, 1,010.14 points, and the biggest one-day point decline, 998.5 points, on an intraday basis in Dow Jones Industrial Average history.[2][3][4]
Great TED talk on Algorithms:
http://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world.html
Cool stuff Enterprise Library 5.0 – May 2011...
http://msdn.microsoft.com/en-us/library/ff632023.aspx
Enterprise Library consists of reusable software components that are designed to assist developers with common enterprise development challenges. It includes a collection of functional application blocks addressing specific cross-cutting concerns such as data access, logging, or validation; and wiring blocks, Unity and the Interception/Policy Injection Application Block, designed to help implement more loosely coupled, testable, and maintainable software systems.
Different applications have different requirements, and you will find that not every application block is useful in every application that you build. Before using an application block, you should have a good understanding of your application requirements and of the scenarios that the application block is designed to address. Note that this release of the Enterprise Library includes a selective installer that allows you to choose which of the blocks you wish to install.
Microsoft Enterprise Library 5.0 contains the following application blocks:
- Caching Application Block. Developers can use this application block to incorporate a cache in their applications. Pluggable cache providers and persistent backing stores are supported.
- Cryptography Application Block. Developers can use this application block to incorporate hashing and symmetric encryption in their applications.
- Data Access Application Block. Developers can use this application block to incorporate standard database functionality in their applications, including both synchronous and asynchronous data access and returning data in a range of formats.
- Exception Handling Application Block. Developers and policy makers can use this application block to create a consistent strategy for processing exceptions that occur throughout the architectural layers of enterprise applications.
- Logging Application Block. Developers can use this application block to include logging functionality for a wide range of logging targets in their applications. This release further improves logging performance.
- Policy Injection Application Block. Powered by the Interception mechanism built in Unity, this application block can be used to implement interception policies to streamline the implementation of common features, such as logging, caching, exception handling, and validation, across a system.
- Security Application Block. Developers can use this application block to incorporate authorization and security caching functionality in their applications.
- Unity Application Block. Developers can use this application block as a lightweight and extensible dependency injection container with support for constructor, property, and method call injection, as well as instance and type interception.
- Validation Application Block. Developers can use this application block to create validation rules for business objects that can be used across different layers of their applications.
Word 2010 Page Break
http://office.microsoft.com/en-us/word-help/insert-a-page-break-HA010368779.aspx?CTT=1
Ctrl+Enter - I keep forgetting it, won't lie.
Math.Round - Midpoint Rounding Enumeration
Math.Round in c# might not do what you think it does.
http://msdn.microsoft.com/en-us/library/system.midpointrounding.aspx
If you don't specify the enumeration above, it defaults to MidpointRounding.ToEven (banker's rounding), so a midpoint of .5 goes to the nearest EVEN number rather than always up:
4.5 --> 4
5.5 --> 6
http://msdn.microsoft.com/en-us/library/3s2d3xkk.aspx
Interesting article RE banker's rounding:
http://www.xbeat.net/vbspeed/i_BankersRounding.htm
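Python's built-in round() happens to use the same round-half-to-even rule as C#'s Math.Round default, so the surprise is easy to reproduce as a sketch (this is an analogy, not the C# API itself):

```python
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_HALF_UP

# Python's round() uses "banker's rounding" (round half to even),
# the same behavior as C#'s Math.Round default (MidpointRounding.ToEven).
print(round(4.5))   # 4 -- .5 goes to the even neighbor
print(round(5.5))   # 6 -- not always down: 5.5 rounds UP to the even 6

# The "schoolbook" behavior (half away from zero, like C#'s
# MidpointRounding.AwayFromZero) needs an explicit rounding mode:
print(Decimal("4.5").quantize(Decimal("1"), rounding=ROUND_HALF_UP))    # 5
print(Decimal("4.5").quantize(Decimal("1"), rounding=ROUND_HALF_EVEN))  # 4
```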
Monday, November 26, 2012
Monitoring System Performance Server 2008
Windows Server 2008 : Monitoring System Performance (part 1) - Key Elements to Monitor for Bottlenecks
Capacity analysis is not about how much information you can collect; it is about collecting the appropriate system health indicators and the right amount of information. Without a doubt, you can capture and monitor an overwhelming amount of information from performance counters. There are more than 1,000 counters, so you’ll want to carefully choose what to monitor. Otherwise, you might collect so much information that the data will be hard to manage and difficult to decipher. Keep in mind that more is not necessarily better with regard to capacity analysis. This process is more about efficiency. Therefore, you need to tailor your capacity-analysis monitoring as accurately as possible to how the server is configured.
Every Windows Server 2008 R2 server has a common set of resources that can affect performance, reliability, stability, and availability. For this reason, it’s important that you monitor this common set of resources, namely CPU, memory, disk, and network utilization.
In addition to the common set of resources, the functions that the Windows Server 2008 R2 server performs can influence what you should consider monitoring. So, for example, you would monitor certain aspects of system performance on file servers differently than you would for a domain controller running on Windows Server 2008 R2 AD DS. There are many functional roles (such as file and print sharing, application sharing, database functions, web server duties, domain controller roles, and more) that Windows Server 2008 R2 can perform, and it is important to understand all those roles that pertain to each server system. By identifying these functions and monitoring them along with the common set of resources, you gain much greater control and understanding of the system.
The following sections go more in depth on what specific items you should monitor for the different components that constitute the common set of resources. It’s important to realize, though, that there are several other items that should be considered regarding monitoring in addition to the ones described in this article. You should consider the following material a baseline of the minimum number of things to begin your capacity-analysis and performance-optimization procedures.
Key Elements to Monitor for Bottlenecks
As mentioned, four resources compose the common set of resources: memory and pagefile usage, processor, disk subsystem, and network subsystem. They are also the most common contributors to performance bottlenecks. A bottleneck can be defined in two ways. The most common perception of a bottleneck is that it is the slowest part of your system. It can either be hardware or software, but generally speaking, hardware is usually faster than software. When a resource is overburdened or just not equipped to handle higher workload capacities, the system might experience a slowdown in performance. For any system, the slowest component of the system is, by definition, considered the bottleneck. For example, a web server might be equipped with ample RAM, disk space, and a high-speed network interface card (NIC), but if the disk subsystem has older drives that are relatively slow, the web server might not be able to effectively handle requests. The bottleneck (that is, the antiquated disk subsystem) can drag the other resources down.
A less common, but equally important, form of bottleneck is one where a system has significantly more RAM, processors, or other system resources than the application requires. In these cases, the system creates extremely large pagefiles, has to manage very large sets of disk or memory sets, yet never uses the resources. When an application needs to access memory, processors, or disks, the system might be busy managing the idle resource, thus creating an unnecessary bottleneck caused by having too many resources allocated to a system. Thus, performance optimization means not having too few resources, but also means not having too many resources allocated to a system.
Monitoring System Memory and Pagefile Usage
Available system memory is usually the most common source for performance problems on a system. The reason is simply that incorrect amounts of memory are usually installed on a Windows Server 2008 R2 system. Windows Server 2008 R2 tends to consume a lot of memory. Fortunately, the easiest and most economical way to resolve the performance issue is to configure the system with additional memory. This can significantly boost performance and upgrade reliability.
There are many significant counters in the memory object that could help determine system memory requirements. Most network environments shouldn’t need to consistently monitor every single counter to get accurate representations of performance. For long-term monitoring, two very important counters can give you a fairly accurate picture of memory pressure: Page Faults/sec and Pages/sec. These two memory counters alone can indicate whether the system is properly configured and experiencing memory pressure. Table 1 outlines the counters necessary to monitor memory and pagefile usage, along with a description of each.
By default, the Memory tab in Resource Monitor, shown in Figure 1, provides a good high-level view of current memory activity. For more advanced monitoring of memory and pagefile activity, use the Performance Monitor snap-in.
Figure 1. Memory section of the Resource Monitor.
Systems experience page faults when a process requires code or data that it can’t find in its working set. A working set is the amount of memory that is committed to a particular process. When this happens, the process has to retrieve the code or data in another part of physical memory (referred to as a soft fault) or, in the worst case, has to retrieve it from the disk subsystem (a hard fault). Systems today can handle a large number of soft faults without significant performance hits. However, because hard faults require disk subsystem access, they can cause the process to wait significantly, which can drag performance to a crawl. The difference between memory and disk subsystem access speeds is exponential even with the fastest hard drives available. The Memory section of the Resource Monitor in Performance Monitor includes columns that display working sets and hard faults by default.
The Page Faults/sec counter reports both soft and hard faults. It’s not uncommon to see this counter displaying rather large numbers. Depending on the workload placed on the system, this counter can display several hundred faults per second. When it gets beyond several hundred page faults per second for long durations, you should begin checking other memory counters to identify whether a bottleneck exists.
Probably the most important memory counter is Pages/sec. It reveals the number of pages read from or written to disk and is, therefore, a direct representation of the number of hard page faults the system is experiencing. Microsoft recommends upgrading the amount of memory in systems that are seeing Pages/sec values consistently averaging above 5 pages per second. In actuality, you’ll begin noticing slower performance when this value is consistently higher than 20. So, it’s important to carefully watch this counter as it nudges higher than 10 pages per second.
Note
The Pages/sec counter is also particularly useful in determining whether a system is thrashing. Thrashing is a term used to describe systems experiencing more than 100 pages per second. Thrashing should never be allowed to occur on Windows Server 2008 R2 systems because the reliance on the disk subsystem to resolve memory faults greatly affects how efficiently the system can sustain workloads.
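The thresholds above (5, 20, and 100 pages per second) boil down to a simple triage rule. A hedged Python sketch - the function name and sample values are my own, only the thresholds come from the article:

```python
# Sketch of the Pages/sec triage described above. The thresholds (5, 20,
# 100 pages per second) come from the article; everything else is mine.

def classify_pages_per_sec(samples):
    """Classify sustained hard-fault pressure from Pages/sec samples."""
    avg = sum(samples) / len(samples)
    if avg > 100:
        return "thrashing"   # disk is constantly resolving memory faults
    if avg > 20:
        return "degraded"    # users will notice slower performance
    if avg > 5:
        return "watch"       # Microsoft's consider-more-RAM threshold
    return "ok"

print(classify_pages_per_sec([2, 3, 4]))         # ok
print(classify_pages_per_sec([30, 25, 40]))      # degraded
print(classify_pages_per_sec([150, 200, 180]))   # thrashing
```

In practice you would feed this averaged samples collected from Performance Monitor over a long window, not a handful of spot readings.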
System memory (RAM) is limited in size and Windows supplements the use of RAM with virtual memory, which is not as limited. Windows will begin paging to disk when all RAM is being consumed, which, in turn, frees RAM for new applications and processes. Virtual memory resides in the pagefile.sys file, which is usually located in the root of the system drive. Each disk can contain a pagefile. The location and size of the pagefile is configured under the Virtual Memory section, shown in Figure 2.
Figure 2. Virtual Memory configuration options.
Tip
Windows will normally automatically handle and increase the size of pagefile.sys as needed; however, in some cases you might want to increase performance and manage virtual memory settings yourself. Keeping the default pagefile on the system drive and adding a second pagefile to another hard disk can significantly improve performance.
Spanning virtual memory across multiple disks, or simply placing the pagefile.sys on another, less-used disk, will also allow Windows to run faster. Just ensure that the other disk isn’t slower than the disk pagefile.sys is currently on. The more physical memory a system has, the more virtual memory will be allocated.
SharePoint Calculated Fields Functions and Formulas
I was trying to figure out what I have available to me as far as Calculated Fields within SharePoint and came across this good article. Looks like basically Excel Formulas are what will work:
http://www.sharepointpanda.com/2009/04/sharepoint-calculated-field-functions-and-formulas/
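For instance, Excel-style formulas like these work in a calculated column (the column names here are made up for illustration):

```
=IF([Status]="Complete", "Done", "Open")
=[Due Date]-[Start Date]
=CONCATENATE([First Name], " ", [Last Name])
```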
Monday, November 19, 2012
Active Directory Exploration
I use the following tools for AD Exploration:
Active Directory Explorer
Active Directory Users and Groups
cmd line - net user lanID /domain
Saturday, November 17, 2012
Using RoboCopy for backups - use this a ton, just documenting (-:
There are many paid and free software solutions available to back up critical data and files on a computer system. Many users, however, are unaware of an inbuilt Windows 7 command called ROBOCOPY (Robust File Copy) that allows users to create simple or highly advanced backup strategies.
In its simplest form, ROBOCOPY can be likened to an extension of XCOPY on steroids. Some of the more important features that make ROBOCOPY an attractive backup alternative are:
- multi-threaded copying
- mirroring or synchronisation mode between the destination and source
- automatic retry and copy resumption
http://technet.microsoft.com/en-us/library/cc733145(WS.10).aspx
and also here:
http://www.theether.net/download/Mic...s/robocopy.pdf
or by simply typing robocopy /? at a cmd window prompt.
Note
ROBOCOPY is a FOLDER copier, not a FILE copier - the source and destination syntax arguments in ROBOCOPY can only be folder names.
Creating a BACKUP strategy
The easiest way to use the ROBOCOPY command to create a backup strategy is by creating a batch (.BAT) file. A batch file is a simple text file that, when executed, runs several commands one after the other.
Step 1
Click Start, and in the search box, type notepad. Select Notepad to open a new blank Notepad document.
Step 2
Type the ROBOCOPY commands into the Notepad document, save it as a .BAT file, and then execute it.
In the example below, I have 3 folders (Data1, Data2, and Data3) containing some data that I wish to backup. One folder is located on E: drive and the other two are located on F: drive. I wish to back these up as follows:
Data1 folder on E: backup to a folder called Backups on G: (external USB drive)
Data2 folder on F: backup to a folder called Backups on G: (external USB drive)
Data3 folder on F: backup to a folder called Backups on Q: (network storage drive)
The general format of the ROBOCOPY command is:
Code:
robocopy <source> <destination> <options>
Code:
robocopy E:\Data1 G:\Backups\Data1
robocopy F:\Data2 G:\Backups\Data2
robocopy F:\Data3 Q:\Backups\Data3
pause
Tip
If source or destination paths contain spaces in them, enclose these in double quotes e.g. "C:\My Data\My Music"
Only the source folders need to exist - ROBOCOPY automatically creates the destination folders if they are missing. The source folders are:
- E:\Data1
- F:\Data2
- F:\Data3
and the destination folders are:
- G:\Backups\Data1
- G:\Backups\Data2
- Q:\Backups\Data3
Tip
The PAUSE command at the bottom of the .BAT file allows the cmd window to stay open after it has completed to allow me to see the output from ROBOCOPY.
If I save the .BAT file to my Desktop, and run it by double-clicking it, then a cmd window is opened and the .BAT file executes the three ROBOCOPY commands as shown below.
The same information is repeated for every ROBOCOPY line in the .BAT file.
In order to utilise some of the powerful functionality in ROBOCOPY, I need to utilise some options in the ROBOCOPY command line. In this next example I want to edit my existing backup strategy such that:
- All sub-folders within my data folders are backed up, even if they are empty.
- The backup only copies newer files to my existing backup - this means a faster backup time.
- The percentage progress counter for copying is not shown - this neatens the overall appearance of the ROBOCOPY information, and creates a smaller log file.
- The information normally echoed to the cmd window is saved to a log file that I can examine at a later stage.
Code:
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /log+:backup_log.txt
pause
/e = copy all sub-folders, even empty ones
/mir = mirror the source to the destination (only changed files are copied, and files that no longer exist in the source are deleted from the destination)
/np = no progress counter
/log: = create a logfile
Tip
Note the use of the /log+: option in the 2nd and 3rd line of the .BAT file. This option ensures that the results of the 2nd and 3rd ROBOCOPY are appended to the log file created in the 1st ROBOCOPY line, meaning I only need one log file to capture all the information I backup.
The log file is always saved to the same folder as the .BAT file - in my case, the folder is saved to my Desktop.
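The /log: on the first line and /log+: on the rest is a mechanical pattern, so it lends itself to being generated. A Python sketch that writes that kind of .BAT content - the folder pairs are hypothetical examples, and /mir is deliberately left out for safety:

```python
# Sketch: generate the .BAT lines described above -- /log: on the first
# ROBOCOPY line, /log+: on the rest, so everything lands in one log file.
# The folder pairs passed in below are hypothetical examples.

def make_backup_bat(pairs, logfile="backup_log.txt"):
    lines = []
    for i, (src, dst) in enumerate(pairs):
        log = ("/log:" if i == 0 else "/log+:") + logfile
        # Quote paths in case they contain spaces, per the tip above.
        lines.append(f'robocopy "{src}" "{dst}" /e /np {log}')
    lines.append("pause")   # keep the cmd window open to review output
    return "\r\n".join(lines)

bat = make_backup_bat([
    (r"E:\Data1", r"G:\Backups\Data1"),
    (r"F:\Data2", r"G:\Backups\Data2"),
])
print(bat)
```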
Warning
Use the /MIR option with caution - it has the ability to delete a file from both the source and destination under certain conditions.
This typically occurs if a file/folder in the destination has been deleted, causing ROBOCOPY to mirror the source to the destination. The result is that the same files in the source folder are also deleted. To avoid this situation, never delete any files/folders from the destination - delete them from the source, and then run the backup to mirror the source to the destination.
If in doubt, do not use the /MIR option, but be aware that backups will take longer.
Since the output from ROBOCOPY is written to the log file, the cmd window will not display the output from ROBOCOPY. If I wish to have this information written to both the log file and the cmd window for visual tracking of the backup process, then I can add the /tee option to each line in the .BAT file, as shown below.
Code:
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /tee /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /tee /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /tee /log+:backup_log.txt
pause
Code:
-------------------------------------------------------------------------------
   ROBOCOPY     ::     Robust File Copy for Windows
-------------------------------------------------------------------------------

  Started : Sun Sep 18 23:35:01 2011
   Source : E:\Data1\
     Dest : G:\Backups\Data1\
    Files : *.*
  Options : *.* /S /E /COPY:DAT /PURGE /MIR /R:1000000 /W:30
------------------------------------------------------------------------------
                           2    E:\Data1\
------------------------------------------------------------------------------
               Total    Copied   Skipped  Mismatch    FAILED    Extras
    Dirs :         1         0         1         0         0         0
   Files :         2         0         2         0         0         0
   Bytes :   442.1 k         0   442.1 k         0         0         0
   Times :   0:00:00   0:00:00   0:00:00   0:00:00
    Ended : Sun Sep 18 23:35:01 2011
Code:
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /tee /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /tee /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /z /tee /log+:backup_log.txt
ROBOCOPY also has the ability to perform faster multi-threaded copying by simply using the option /mt. I can choose the number of threads to use by specifying a number between 1 and 128 after the /mt option, or if I just specify /mt without a number then it will use 8 threads by default. In the example below, I use 4 threads to copy to my USB drive, and 8 threads (no number) to copy to my network drive.
Code:
robocopy E:\Data1 G:\Backups\Data1 /e /mir /np /tee /mt:4 /log:backup_log.txt
robocopy F:\Data2 G:\Backups\Data2 /e /mir /np /tee /mt:4 /log+:backup_log.txt
robocopy F:\Data3 Q:\Backups\Data3 /e /mir /np /z /tee /mt /log+:backup_log.txt
Code:
rem --- Edit the lines below to create your own backup strategy
rem --- The /mir option has been left out for safety sake
rem --- Add more lines for each new folder requiring backup
rem --- Specified 4 threads to use for multi-threaded copying
rem --- The results of the backup can be found in my_backup_log.txt
robocopy <source> <destination> /e /np /tee /mt:4 /log:my_backup_log.txt
robocopy <source> <destination> /e /np /tee /mt:4 /log+:my_backup_log.txt
robocopy <source> <destination> /e /np /tee /mt:4 /log+:my_backup_log.txt
pause
Tip
You can automate your backups by using the Windows 7 Task Scheduler to run the .BAT file at specific times.
For more information please refer to this tutorial by Orbital Shark:
Task Scheduler - Create New Task
I hope this brief tutorial helps you access the power and simplicity of ROBOCOPY to create some effective backup strategies.
Regards,
Golden
Friday, November 16, 2012
Galaxy SIII Android Phone - Wireless Network Drops - *#0011#
I recently came into ownership of a sweet Galaxy SIII phone - but noticed that when using the phone as a wifi hotspot it would randomly disconnect from a wifi connection...
Go into the admin screen through the phone dialer using *#0011#, then turn off the wifi power save feature. Problem solved.
Wednesday, November 14, 2012
Remote Blob Storage in SharePoint
Interesting architectural decision point within SharePoint - apparently the advent of SkyDrive in SP 2013 removes the need for this...
http://blogs.msdn.com/b/opal/archive/2010/03/24/faq-sharepoint-2010-remote-blob-storage-rbs.aspx
Saturday, November 10, 2012
What is the difference between converting and casting in c#?
Converting deals with all base types, casting doesn't...
http://stackoverflow.com/questions/1608801/difference-between-convert-toint32-and-int
Marc Gravell's post sums it up very well - Parse and TryParse...
Tuesday, November 6, 2012
Copying Content Databases to other SP 2007 Environments.... PART TWO: MISNAMED DATABASE
So when I attached the new database back from TEST into DEV, I misnamed the new database (it had the name of an app in the database name that I later found out did not exist in the content database). Now that I am on the straight and narrow path toward enlightenment, I needed to fix this to be consistent.
Therefore...
1) stsadm -o deletecontentdb -url (Site Collection URL) -databasename badDatabaseName
2) SSMS - detach the badDatabaseName
3) Restore from the badDatabaseName backup into goodDatabaseName (or simply rename badDatabaseName to goodDatabaseName)
4) stsadm -o addcontentdb -url (Site Collection URL) -databasename goodDatabaseName
Friday, November 2, 2012
I'm supporting some old VB code - it's full of ExecuteScalar calls to the database... What on earth does Scalar mean in the VB world? Return a single value!
I know what Scalar means with respect to Physics, but what about database calls? Return a single value!
VB Scalar: http://msdn.microsoft.com/en-us/library/eeb84awz(v=vs.71).aspx
Physics/Mathematical Scalar: http://www.physicsclassroom.com/class/1dkin/u1l1b.cfm
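The same idea sketched in Python's sqlite3 - a rough analogy only, since sqlite3 has no ExecuteScalar, but fetchone()[0] plays the same "first column of the first row" role (table and data are made up):

```python
import sqlite3

# "Scalar" just means the query returns one value. ADO.NET's ExecuteScalar
# returns the first column of the first row; in Python's sqlite3 the
# equivalent move is fetchone()[0].
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 9.99), (2, 25.00), (3, 5.01)])

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)   # 3
```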
Explaining UNION vs UNION ALL to a colleague - just a note...
http://www.w3schools.com/sql/sql_union.asp
It's just like it sounds: UNION ALL gives you everything, duplicates included... UNION removes those duplicates.
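A quick way to see it, sketched with Python's built-in sqlite3 (the tables and city names are made up for illustration):

```python
import sqlite3

# UNION ALL keeps duplicates; UNION removes them.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE a (city TEXT)")
conn.execute("CREATE TABLE b (city TEXT)")
conn.executemany("INSERT INTO a VALUES (?)", [("Paris",), ("Berlin",)])
conn.executemany("INSERT INTO b VALUES (?)", [("Berlin",), ("Madrid",)])

union_all = conn.execute(
    "SELECT city FROM a UNION ALL SELECT city FROM b").fetchall()
union = conn.execute(
    "SELECT city FROM a UNION SELECT city FROM b ORDER BY city").fetchall()

print(len(union_all))  # 4 -- 'Berlin' appears twice
print(len(union))      # 3 -- the duplicate 'Berlin' is removed
```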
Thursday, November 1, 2012
SCCM vs SCOM
I was educating one of my colleagues on SCCM vs SCOM - just for grins:
http://social.technet.microsoft.com/Forums/en/configmgrgeneral/thread/9776151f-a0be-4d15-8f03-25483bd5d379
http://en.wikipedia.org/wiki/System_Center_Operations_Manager
Copying Content Databases to other SP 2007 Environments....
I'm having the pleasure of working with a very sharp consultant who has been teaching me the ways of SharePoint 2007. I know, I know - old technology, but I am no SP dev, nor have I ever been. Sure, I've built a couple of WebParts and such, but I'm now able to participate in a project that consists of three very well written SP 2007 applications - object models mapped to business classes mapped to lists and much, much more...
Yesterday, I learned that one could copy content databases from say a TEST environment back to a DEV environment to migrate data and Site Collections.
Some of the keys I learned:
- If there is a managed path created in the test environment - which there is in my Test environment called "/apps", the same managed path needs to exist in the target SP environment.
- The Site Collections that exist in the TEST environment (/apps/App1, /apps/App2 ,etc) must not be created in the target DEV environment
- To figure out what content database is tied to each Site Collection DO THIS.
- I have specific notes created for this process and will attach them soon, but basically, this site goes through the process of detaching (backup works too) the source database, moving it, restoring it and then using stsadm -addcontentdb to reattach the database to a Web App. After that, the contained Site Collections within the content db are then created.
http://sharepointsherpa.com/2008/04/18/sharepoint-2007-move-content-and-configuration-databases/
Probably pretty basic for y'all with actual SP Dev experience, new to me nonetheless:
Wednesday, October 31, 2012
Recovery Point Objective and Recovery Time Objective
Hey get your databases back up and running already would you?
RPO: http://en.wikipedia.org/wiki/Recovery_point_objective
RTO: http://en.wikipedia.org/wiki/Recovery_time_objective
Monday, October 29, 2012
Some very bright business minds with some great advice...
There are many gems in this article about the best advice some very successful business leaders ever received:
http://money.cnn.com/gallery/news/companies/2012/10/25/best-advice.fortune
Windows 8 Cheat Sheet
Some are obvious, some are not...
http://pogue.blogs.nytimes.com/2012/10/25/a-windows-8-cheat-sheet/
Thursday, October 25, 2012
SQL Server - Server Level Rights
I'm sure this is old hat for many of you out there, but I'm hardly a full-time DBA by any means. So - I read back up in prep for setting up some security access groupings at my new gig...
http://www.sqlservercentral.com/articles/Administration/sqlserversecurityfixedroles/1163/
Monday, October 22, 2012
Bulk Insert into Oracle utilizing SSIS
Article found here fyi: http://msdn.microsoft.com/en-us/library/hh923024.aspx
Optimized Bulk Loading of Data into Oracle
SQL Server Technical Article
Writers: Carla Sabotta, Debarchan Sarkar
Technical Reviewers: Matt Masson, Jason Howell
Published: 04/2012
Applies to: SQL Server 2005 (all editions), SQL Server 2008, SQL Server 2008 R2, and SQL Server 2012 (non-Enterprise and non-Developer editions)
Summary: SQL Server 2008 and SQL Server 2008 R2 (Enterprise and Developer editions) support bulk loading Oracle data using Integration Services packages with the Microsoft Connector for Oracle by Attunity. For SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008 and 2008 R2, there are alternatives for achieving optimal performance when loading Oracle data. This paper discusses these alternatives. The Enterprise and Developer editions of SQL Server 2012 also support bulk loading Oracle data using Integration Services packages with the Microsoft Connector for Oracle by Attunity.
Copyright
This document is provided "as-is". Information and views expressed in this document, including URL and other Internet Web site references, may change without notice. You bear the risk of using it. Some examples depicted herein are provided for illustration only and are fictitious. No real association or connection is intended or should be inferred. This document does not provide you with any legal rights to any intellectual property in any Microsoft product. You may copy and use this document for your internal, reference purposes.
© 2011 Microsoft. All rights reserved.
Introduction
SQL Server 2008, 2008 R2, and 2012 (Enterprise and Developer editions) support bulk loading Oracle data using Integration Services (SSIS) packages. The Microsoft Connector for Oracle by Attunity provides optimal performance through its high-speed connectors during the loading or unloading of data from Oracle. For more information, see Using the Microsoft Connector for Oracle by Attunity with SQL Server 2008 Integration Services (http://msdn.microsoft.com/en-us/library/ee470675(SQL.100).aspx).
SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008, 2008 R2, and 2012 don't provide an out-of-the-box option for bulk loading Oracle data.
- The fast load options for the OLE DB destination aren't available when you use the OLE DB provider from Oracle, because the provider doesn't implement the IRowsetFastLoad interface (http://msdn.microsoft.com/en-us/library/ms131708.aspx). In addition, the current design of SSIS makes the fast load options available only for the SQL providers. The options aren't available for any other provider, even if the provider implements IRowsetFastLoad.
- The Microsoft OLE DB Provider for Oracle is deprecated and not recommended for use against Oracle versions later than 8i: http://support.microsoft.com/kb/244661
In SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008, 2008 R2, and 2012, the out-of-the-box SSIS components implement single-row inserts to load data to Oracle. When you use single-row inserts, the following issues may occur:
- Long load times and poor performance
- Data migration deadlines are not met
- Timeouts during the ETL process for large production databases (greater than 500 GB) with complex referential integrity
For these releases, there are alternatives for achieving optimal performance when loading Oracle data. This paper discusses these alternatives.
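The gap between single-row inserts and batched, transaction-wrapped inserts is easy to see in any database, not just Oracle. Here is a minimal sketch using Python's built-in sqlite3 module purely for illustration (the table and column names are made up; the single-row variant is included only for contrast with the batched pattern the paper's Script component uses):

```python
import sqlite3

def load_single_row(conn, rows):
    # One INSERT per row, committed individually -- the pattern the
    # stock SSIS components fall back to when loading Oracle data.
    cur = conn.cursor()
    for r in rows:
        cur.execute("INSERT INTO customer(customer_id, account_number) VALUES (?, ?)", r)
        conn.commit()  # a commit (round-trip + log flush) on every single row

def load_batched(conn, rows, batch_size=8 * 1024):
    # Parameterized inserts committed once per batch_size rows -- the
    # pattern the VB Script component later in the paper implements.
    cur = conn.cursor()
    for i, r in enumerate(rows, start=1):
        cur.execute("INSERT INTO customer(customer_id, account_number) VALUES (?, ?)", r)
        if i % batch_size == 0:
            conn.commit()  # amortize the commit cost over the whole batch
    conn.commit()          # flush the final, partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer(customer_id INTEGER, account_number TEXT)")
rows = [(i, "AW%08d" % i) for i in range(10000)]
load_batched(conn, rows)
print(conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0])  # 10000
```

The shape is the same with OraOLEDB or any other provider: the win comes from fewer commits, not from anything database-specific.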
Alternatives for Optimized Loading and Unloading Oracle Data
The following are alternatives for optimizing the loading of Oracle data.
Alternative | SQL Server versions
Customized Script component | SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008, 2008 R2, and 2012
Third-party components | SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008 and 2008 R2
Customized Script Component
In this solution, a Script component is configured as a destination. The component connects to Oracle using the OLE DB provider from Oracle (OraOLEDB) and bulk loads data to an Oracle database. The Script component performs the data load in about half the time it would take to perform single-row inserts using an OLE DB destination.
The provider is included in the Oracle Data Access Components (ODAC), available for download on the Oracle Data Access Components site (http://www.oracle.com/technetwork/topics/dotnet/utilsoft-086879.html). An Oracle online account is required to download the software.
Note: You can also configure the Script component to connect to Oracle using the ODP.NET provider from Oracle.
The System.Data.OleDb namespace (http://tinyurl.com/7zuffuf) is used in the script, as shown in the Microsoft Visual Basic code example below. The namespace is the .NET Framework Data Provider for OLE DB.
The PreExecute (http://tinyurl.com/86e4exe) method is overridden to create the OleDbParameter objects for each of the input columns. The parameters are added to the OleDbCommand object to configure the parameterized command that the destination will use to insert the data. In the example, the input columns are CustomerID, TerritoryID, AccountNumber, and ModifiedDate. Then, the database transaction is started.
The AcquireConnections (http://tinyurl.com/7qkkqvq) method is overridden to return a System.Data.OleDb.OleDbConnection from the connection manager that connects to the Oracle database.
The ProcessInputRow (http://tinyurl.com/8y7vnh5) method is overridden to process the data in each input row as it passes through.
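In outline, the overridden methods cooperate in a fixed lifecycle: acquire the connection, prepare the command and open a transaction, execute per row with periodic commits, then flush the remainder. A hypothetical Python rendering of that same lifecycle (sqlite3 stands in for OraOLEDB; the method names mirror the SSIS ones but nothing here is an actual SSIS API):

```python
import sqlite3

class OracleBulkDestination:
    # Illustrative analogue of the Script component's lifecycle:
    # AcquireConnections -> PreExecute -> ProcessInputRow (per row) -> PostExecute.
    def __init__(self, batch_size=8 * 1024):
        self.batch_size = batch_size
        self.row_count = 0

    def acquire_connections(self):
        # Stands in for obtaining an OleDbConnection from the connection manager.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE Customer(CustomerID INTEGER, TerritoryID INTEGER,"
            " AccountNumber TEXT, ModifiedDate TEXT)")

    def pre_execute(self):
        # Prepare the parameterized INSERT; sqlite3 implicitly opens the transaction.
        self.sql = ("INSERT INTO Customer(CustomerID, TerritoryID,"
                    " AccountNumber, ModifiedDate) VALUES (?, ?, ?, ?)")
        self.cur = self.conn.cursor()

    def process_input_row(self, row):
        # Bind the four input columns and execute; commit every batch_size rows.
        self.cur.execute(self.sql, row)
        self.row_count += 1
        if self.row_count % self.batch_size == 0:
            self.conn.commit()

    def post_execute(self):
        self.conn.commit()  # commit the final, partial batch

dest = OracleBulkDestination(batch_size=100)
dest.acquire_connections()
dest.pre_execute()
for i in range(250):
    dest.process_input_row((i, 1, "AW%08d" % i, "2012-10-22"))
dest.post_execute()
print(dest.conn.execute("SELECT COUNT(*) FROM Customer").fetchone()[0])  # 250
```

The VB code below is the real thing; this sketch is only meant to make the division of labor between the overridden methods easier to follow.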
To configure the Script component
1. Add a data source to the package, such as an OLE DB source. The data source should have fields that can be easily loaded into a target table. In this example, we're using the Customer table in the AdventureWorks database as the data source, and selecting the CustomerID, TerritoryID, AccountNumber, and ModifiedDate input columns.
2. Add a Script component and configure the component as a destination. Connect the component to the data source.
3. Double-click the Script component to open the Script Transformation Editor.
4. Click Connection Managers in the left-hand pane.
5. Click Add, and then select <New connection> in the Connection Manager field. The Add SSIS Connection Manager dialog box appears.
6. In the Add SSIS Connection Manager dialog box, select ADO.NET in the Connection manager type area, and then click Add.
7. In the Configure ADO.NET Connection Manager dialog box, click New to create a new data connection for the connection manager.
8. In the Connection Manager dialog box, click the arrow next to the Provider drop-down list.
9. Expand the .Net Providers for OleDb folder, click Oracle Provider for OLE DB, and then click OK.
10. Click Test Connection to confirm the connection, and then click OK.
11. In the Configure ADO.NET Connection Manager dialog box, click the data connection you've created, and then click OK.
12. In the Script Transformation Editor, click Input Columns in the left-hand pane, and select the CustomerID, TerritoryID, AccountNumber, and ModifiedDate columns in the Available Input Columns box.
13. Click Script in the left-hand pane.
14. Confirm that the ScriptLanguage property value is Microsoft Visual Basic, and then click Edit Script.
NOTE: In SQL Server 2008, the Design Script button was renamed to Edit Script, and support was added for the Microsoft Visual C# programming language.
Add the following Visual Basic code.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports System.Data.OleDb
Imports System.Data.Common

<Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute> _
<CLSCompliant(False)> _
Public Class ScriptMain
    Inherits UserComponent

    Dim row_count As Int64
    Dim batch_size As Int64
    Dim connMgr As IDTSConnectionManager100
    Dim oledbconn As OleDbConnection
    Dim oledbtran As OleDbTransaction
    Dim oledbCmd As OleDbCommand
    Dim oledbParam As OleDbParameter

    Public Overrides Sub PreExecute()
        batch_size = 8 * 1024
        row_count = 0

        ' Build the parameterized INSERT that each input row will execute.
        oledbCmd = New OleDbCommand("INSERT INTO Customer(CustomerID, TerritoryID, AccountNumber, ModifiedDate) VALUES(?, ?, ?, ?)", oledbconn)

        oledbParam = New OleDbParameter("@CustomerID", OleDbType.Integer, 7)
        oledbCmd.Parameters.Add(oledbParam)
        oledbParam = New OleDbParameter("@TerritoryID", OleDbType.Integer, 7)
        oledbCmd.Parameters.Add(oledbParam)
        oledbParam = New OleDbParameter("@AccountNumber", OleDbType.VarChar, 7)
        oledbCmd.Parameters.Add(oledbParam)
        oledbParam = New OleDbParameter("@ModifiedDate", OleDbType.Date, 7)
        oledbCmd.Parameters.Add(oledbParam)

        ' Start the first batch transaction.
        oledbtran = oledbconn.BeginTransaction()
        oledbCmd.Transaction = oledbtran

        MyBase.PreExecute()
    End Sub

    Public Overrides Sub AcquireConnections(ByVal Transaction As Object)
        ' Return an OleDbConnection from the connection manager.
        connMgr = Me.Connections.Connection
        oledbconn = CType(connMgr.AcquireConnection(Nothing), OleDb.OleDbConnection)
    End Sub

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        With oledbCmd
            .Parameters("@CustomerID").Value = Row.CustomerID
            .Parameters("@TerritoryID").Value = Row.TerritoryID
            .Parameters("@AccountNumber").Value = Row.AccountNumber
            .Parameters("@ModifiedDate").Value = Row.ModifiedDate
            .ExecuteNonQuery()
        End With

        ' Commit every batch_size rows, then open a new transaction.
        row_count = row_count + 1
        If (row_count Mod batch_size) = 0 Then
            oledbtran.Commit()
            oledbtran = oledbconn.BeginTransaction()
            oledbCmd.Transaction = oledbtran
        End If
    End Sub

    Public Overrides Sub PostExecute()
        ' Commit any rows remaining in the final, partial batch.
        oledbtran.Commit()
        MyBase.PostExecute()
    End Sub

    Public Overrides Sub ReleaseConnections()
        connMgr.ReleaseConnection(oledbconn)
        MyBase.ReleaseConnections()
    End Sub
End Class
15. Save your changes to the Script component.
The SSIS package now contains the custom Script component, configured as a destination to bulk load data into the Oracle database.
Note: The above Script component connects to Oracle, but it can be used to connect to other third-party data sources such as Sybase and Informix. The only change you need to make is to configure the connection manager to use the correct OLE DB provider for Sybase or Informix.
Third-party Components
In addition to the Script component solution discussed in this paper, there are third-party components that you can use to achieve optimal performance when loading Oracle data. The following components work with both SQL Server 2005 and SQL Server 2008.
- Oracle Destination and ODBC Destination components from CozyRoc. For more information, see the CozyRoc web site.
- Progress DataDirect Connect and DataDirect Connect64 components from Progress DataDirect. For more information, see the DataDirect web site.
Conclusion
SQL Server 2008, 2008 R2, and 2012 (Enterprise and Developer editions) support bulk loading Oracle data using SSIS packages.
For other SQL Server versions and editions, the following are alternatives for optimizing the loading of Oracle data when using SSIS packages.
Alternative | SQL Server versions and editions
Script component bulk loads data to Oracle using the Oracle OLE DB provider from Oracle | SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008, 2008 R2, and 2012
Third-party components that connect to Oracle, from CozyRoc, Persistent, and DataDirect | SQL Server 2005 and the non-Enterprise and non-Developer editions of SQL Server 2008 and 2008 R2
For more information:
Connectivity and SQL Server 2005 Integration Services (http://msdn.microsoft.com/en-us/library/bb332055(SQL.90).aspx)
SSIS with Oracle Connectors (http://social.technet.microsoft.com/wiki/contents/articles/1957.ssis-with-oracle-connectors.aspx)
SQL Server 2012: Microsoft Connectors V2.0 for Oracle and Teradata (http://www.microsoft.com/en-us/download/details.aspx?id=29283)
SSIS and Netezza: Loading data using OLE DB Destination (http://www.rafael-salas.com/2010/06/ssis-and-netezza-loading-data-using-ole.html)