Wednesday, December 1, 2010

Automatically import PSTs into Exchange 2007 and 2010

PST Importer 2010


PST Importer 2010 handles the tricky business of migrating PSTs to MS Exchange in one simple, transparent process. Exchange 2007 and Exchange 2010 are supported. It automatically locates every PST file on your network and transfers each one straight to the appropriate mailbox.

Sunday, June 13, 2010

Microsoft System Center Data Protection Manager

Data Protection Manager (DPM) 2010 is part of the System Center family of management products from Microsoft. It delivers unified data protection for Windows server workloads such as SQL Server, Exchange, SharePoint, virtualization hosts, and file servers, as well as for Windows desktops and laptops.
  • New in 2010 is the ability for roaming laptops to get centrally managed policies around desktop protection.
  • Your laptop data will be protected whether you are connected to the corporate network or travelling on an airplane.
  • DPM also provides native site-to-site replication for Disaster Recovery to either another DPM server or an off-site cloud provider.
  • Centrally managed System State and Bare Metal Recovery are also new in DPM 2010.
DPM seamlessly uses disk, tape, and cloud-based repositories to deliver an easy-to-use, best-of-breed backup and recovery solution for Windows environments. Windows customers of all sizes can rely on Microsoft to provide a scalable and manageable protection solution that is cost-effective, secure, and reliable.

Tuesday, May 18, 2010

The Proper Destruction of Data




Data (as non-random one and zero bits) is by necessity mated with physical storage media when at rest. That data becomes information when the bits can be interpreted for some useful purpose, but information has a lifecycle in which its usefulness declines over time. Some information may have a very long lifetime (a permanent newspaper archive, for example), but other information may have a data retention period measured in weeks, months, or a handful of years.
Whenever its retention period ends, data should simply go away. There are numerous practical and strategic reasons why this is the case. From the perspective of storage capacity management, logical data deletion frees up capacity and eases management processes. But what happens if the information was sensitive or confidential? In many cases, unless it is overwritten with new data, the older data can most likely be recovered. That is unacceptable.
However, the end of the data retention period is not the only time that sensitive or confidential information may be exposed to unauthorized third parties. Say that an IT organization wishes to replace older disk drives with newer ones at the end of a lease period. Or a disk drive may have enough problems (such as bad blocks) that it needs to be returned for a replacement under warranty.
If the replacement is planned, then the data can be migrated to new physical media, a process that implies that the data is no longer available on the original media. But the actual data destruction has to be a conscious decision that involves more work than mere logical deletion. If the replacement is unplanned, say, due to the sudden mechanical failure of a hard disk drive (HDD), all the data remains as it was originally unless steps are taken to destroy it.
Data Destruction – Best Practices Meet Common Sense
But how should organizations go about the data destruction process? Until recently, the National Industrial Security Program (NISP) Operating Manual (DoD 5220.22-M) gave U.S. governmental guidelines for “media sanitization,” which is the public sector term for data destruction. However, the newer “Guidelines for Media Sanitization” (NIST Special Publication 800-88) lists the recommendations from the National Institute of Standards and Technology (NIST) that government agencies should follow.
While private organizations need not follow these guidelines, the recommendations are logical and straightforward. Although a large number of electronic storage media are covered including HDDs, mobile computing devices, and memory devices, the Publication does not (and recognizes that it cannot) identify all current and future devices. For example, Fibre Channel (FC) drives are notably absent. As a result, organizations need to follow the guidelines with both common sense and best practices.
The Publication describes three levels of media sanitization — clearing, purging, and destroying. Clearing is designed to protect against robust keyboard attacks. That is, the data must not be retrievable using data, disk, or file recovery utilities run from standard input devices, or by more sophisticated data scavenging tools. Overwriting media with non-sensitive data is a recommended practice for clearing.
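As a rough illustration of what overwrite-based clearing involves, here is a minimal Python sketch (the file path is purely hypothetical) that overwrites a single file with zeros before deleting it. It is a conceptual toy, not a NIST 800-88 compliant tool: it does not address SSD wear leveling, journaling filesystems, snapshots, or backup copies.

    import os

    def clear_file(path: str, passes: int = 1, chunk_size: int = 1024 * 1024) -> None:
        """Overwrite a file's contents with zeros, then delete it.

        Conceptual sketch of overwrite-based clearing only; real media
        sanitization must also account for SSD wear leveling, journaling
        filesystems, snapshots, and backups, which this does not.
        """
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                remaining = size
                while remaining > 0:
                    block = min(chunk_size, remaining)
                    f.write(b"\x00" * block)
                    remaining -= block
                f.flush()
                os.fsync(f.fileno())   # push the overwrite out of the OS cache
        os.remove(path)                # only now remove the directory entry

    # Example (hypothetical path):
    # clear_file("/data/old_customer_report.csv")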
Purging is a process designed to protect data against a laboratory attack, where highly trained people and sophisticated signal processing equipment are used to recover data from media outside of their normal operating environments, such as standalone Winchester disk drives. Winchester drives, the design commonly used today, encapsulate the disk platters and the read/write mechanisms in a sealed unit.
Purging ranks as the highest level of security that does not involve actual physical destruction of the media. That means the hard drives can be reused with new data, so the investment in the drives is protected. When drives have been removed from their normal operating environment, the Publication recommends that data be purged with a SecureErase command. Firmware-based SecureErase can be executed to destroy the data (and in the process perform both the clearing and purging functions) on most ATA drives over 15 GB that were manufactured after 2001.
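On Linux, the ATA SecureErase sequence is commonly driven with the hdparm utility: set a temporary security password, then issue the erase command so the drive's own firmware performs the purge. The sketch below wraps that two-step sequence in Python; the device path and password are placeholders, and the exact hdparm options and caveats (for example, drives left in a "frozen" security state) should be verified against your hdparm documentation before running anything like this.

    import subprocess

    def ata_secure_erase(device: str = "/dev/sdX", password: str = "p") -> None:
        """Issue a firmware-level ATA Secure Erase via hdparm (Linux).

        WARNING: irreversibly destroys all data on `device`. The device
        path and password are placeholders; check that the drive is not
        "frozen" (hdparm -I) and confirm the option names against your
        hdparm man page before use.
        """
        # Step 1: set a user password, which enables the drive's security feature set.
        subprocess.run(
            ["hdparm", "--user-master", "u", "--security-set-pass", password, device],
            check=True,
        )
        # Step 2: ask the drive firmware to erase every sector, including remapped ones.
        subprocess.run(
            ["hdparm", "--user-master", "u", "--security-erase", password, device],
            check=True,
        )

    # ata_secure_erase("/dev/sdX")  # do NOT run against a disk you care about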
Another purging process, degaussing, uses a strong magnetic field to destroy data on magnetic media such as HDDs and tape. Naturally, degaussing cannot be used on optical media, such as CDs and DVDs. Degaussing a hard drive typically renders inoperative the firmware that manages drive processes. Thus the drive can no longer be used to read and write data even though it has not been physically destroyed.
Finally, destroying is a process typically reserved for circumstances where absolute destruction of data is required. In these cases, physical storage media is rendered beyond the point where any data could be recovered by either a keyboard or a laboratory attack, no matter how sophisticated. Disintegration, incineration, pulverization, and melting are processes that completely destroy the data along with the physical media.
Now, clearing, purging, and destroying processes are implemented at the level of individual pieces of media rather than at the level of selected pieces of information, such as files. In cases of planned data destruction (unplanned data destruction, such as sending a hard disk back for unplanned warranty work, cannot be predicted), IT has to plan in advance to confine sensitive and confidential information to as few pieces of media as possible. At the same time, IT should try to ensure that the retention periods for all the data on a given piece of media end at roughly the same time.
DestructData — Solutions for IT
Of course a number of software tools and hardware solutions (such as degaussing equipment) exist for helping IT cope with data destruction processes. One company at the forefront of this emerging market (especially in light of all the attention to data security breaches) is DestructData, the exclusive distributor of products made by CPR Tools, an engineering company with deep expertise in data recovery and data destruction.
DestructData’s Hammer is a portable standalone device that can be used to purge PATA/SATA hard drives using the NSA-developed SecureErase software (or a CPR Tools utility for drives that are not SecureErase-capable). The Hammer process includes both verification that the data has been removed and an audit trail, information that may be required for organizational or legal reasons.
DestructData’s SCSI Hammer can directly connect to and purge four hard drives at once, or it can connect to a disk array with up to 30 drives and purge them simultaneously without the need to remove them from the enclosure. Verification that the data has been erased, along with the audit trail that is produced, allows the erased drives to be reused with fresh data.
Though the ability to erase Fibre Channel (FC) drives is conspicuously absent from the Hammer solution family, DestructData says that capability is on the horizon. DestructData also offers the DX-CD2 Data Destroyer. This device grinds the data layer on a CD-ROM to 250 microns, which leaves the data beyond forensic recovery. This approach is superior to typical methods, including dimpling, shredding, and disintegration, which can leave 15% to 100% of digital data in recoverable form. Of course, the disc is no longer usable to store data, but hey, the plastic can be sold if there is enough of it.
DestructData also offers PSIClone, a standalone, hand-held data recovery lab for difficult-to-recover data. PSIClone is useful for forensic investigators, and others can use its capabilities to recover good data that is damaged or appears to have been destroyed in accidents such as a fire, a flood, or a head crash on a disk. (However, PSIClone cannot recover data purged with DestructData’s Hammer or SCSI Hammer.)
Summary
With all the hullabaloo about data breaches and the need to maintain data privacy, coupled with compliance regulations and the need to dispose of data in ways that meet the requirements of the Federal Rules of Civil Procedure, more and more attention is going to be paid to data destruction. Large companies with many data devices where either the data or the media has reached its end of life will find this type of technology useful as an in-house capability. However, some of these companies, as well as smaller businesses (and even individuals), may turn to third-party service professionals to help them with media sanitization.
Because of the increased importance of performing media sanitization properly, devices from companies such as DestructData are going to be considered very carefully. And DestructData makes products that meet whatever level of media sanitization is necessary: clearing, purging, or destroying.

By David Hill, Mesabi Group

Wednesday, April 14, 2010

3 stubborn PC problems you can fix



Ever notice how each PC has a personality of its own? Or maybe even multiple personalities? In the course of a week, your computer may act friendly, moody, and sometimes downright mean.
However, don't take a hammer to your PC just yet. The following is a list of common symptoms and treatments to help even the most troublesome PCs. You don't even have to be a psychologist (at least not yet) to deal with your PC's neuroses.
Windows 7 and Windows Vista usually handle issues like these automatically, but overall you'll find that these tips work for all versions of Windows, from Windows 95 to Windows 7.

1. You keep getting a "your system is running low on virtual memory" message

Perhaps you're more than familiar with this scenario: You're working on your PC and notice performance getting gradually slower and slower. Programs become harder to open and close. You wait forever for Web pages to be displayed. And then, you get some serious-sounding "virtual memory is too low" message, like the one in the following graphic.
Don't worry: This message isn't as scary as it sounds.
Example of a "virtual memory is low" message
Virtual memory is the space your computer uses when it's short of RAM (Random Access Memory), which is the memory used when running programs like Microsoft Office Word or Microsoft Office PowerPoint.
So what can you do to correct this problem and prevent this message from coming up in the future? The following are some solutions to keep your computer from displaying the "virtual memory minimum is too low" message.
Solution 1: Bump up the virtual memory size on your computer
The first solution is to increase your computer's virtual memory settings. To do so, you first need to determine how much RAM you currently have.

Windows 7

Solution 2: Add more RAM to your computer
If you keep getting that dreaded "Your system is running low on virtual memory" message—even after you increase your computer's virtual memory—then you may need to buy more memory for your computer. To really work well:
  • Windows 7 needs at least 1 GB of RAM to run. See more system requirements for Windows 7.
  • Windows Vista needs at least 512 MB of RAM to run, but for some applications (like gaming) 1 GB or more of RAM is recommended.
  • Windows XP needs a minimum of 256 MB of RAM.
The more RAM you have, the better.

Find out how much RAM you have in your computer
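If you'd rather check programmatically than click through the Windows dialogs, a short Python sketch like the one below reports installed RAM and current page-file (swap) usage. It assumes the third-party psutil package is installed (pip install psutil); the numbers it reports can differ slightly from what Windows shows in System Properties.

    import psutil  # third-party package: pip install psutil

    def report_memory() -> None:
        """Print installed RAM and page-file (swap) usage in gigabytes."""
        ram = psutil.virtual_memory()
        swap = psutil.swap_memory()
        gib = 1024 ** 3
        print(f"Installed RAM : {ram.total / gib:.1f} GB ({ram.percent}% in use)")
        print(f"Page file     : {swap.total / gib:.1f} GB ({swap.percent}% in use)")

    if __name__ == "__main__":
        report_memory()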

If you're at work, contact your company's IT administrator before updating the memory on your computer. They may have some memory available and can help you install it.
If you do need to purchase some more memory, stop by your local computer shop. You can probably buy memory from them, and they'll probably install it for you. Or, you can buy memory online.

2. Your windows slide off the desktop—and you can't grab them

We're all familiar with moving program windows around the desktop. You can click-and-hold the window's title bar to move it around. But what do you do when you accidentally move a window's title bar off the desktop so you can't grab it anymore? The window is stuck in that inconvenient position.
Solution: Use your keyboard to help move your window
The trick to moving these stubborn program windows is to use your keyboard.

Use your keyboard to move a window:
  • Click the program's button on the taskbar (or press Alt+Tab) to make the stuck window active.
  • Press Alt+Spacebar to open the window's shortcut menu, and then press M to choose Move.
  • Use the arrow keys to bring the window back onto the desktop.
  • Press Enter when the window is where you want it.

3. Your taskbar has disappeared

The taskbar is that horizontal bar at the bottom of your computer screen that displays open programs on your desktop. The taskbar also contains the Start menu, which allows you to navigate to various programs installed on your computer. In many ways, it's your command central.
Thus, there's nothing more frustrating than going to start a program, only to find the taskbar gone. A computer without a taskbar can bring you to a grinding halt.
The good news is that the taskbar never disappears—it just hides. It may be hiding behind other open windows, or at the top or side of your screen. You can also (unintentionally) make the taskbar so thin that it seems invisible.
The following are possible reasons why your taskbar has vanished, as well as solutions to keep your taskbar from ever running away again.

Solution 1: Find your taskbar behind other windows

Solution 2: Find your taskbar elsewhere on your screen

Solution 3: Thicken your taskbar

Where to find more help

This article covers three common PC problems. But if you're still unable to find the solution to your particular PC problems, check out the Microsoft support page. There, you'll find various self-support and assisted support solutions. You'll find answers to cure even the most disturbed computer.

Microsoft Windows HPC Server 2008


New to Windows HPC Server 2008?

Get acquainted.

A Competitive Advantage

High Performance Computing gives analysts, engineers, and scientists the computational resources they need to make better decisions, fuel product innovation, speed research and development, and accelerate time to market. Some examples of HPC usage include decoding genomes, animating movies, analyzing financial risks, streamlining crash test simulations, modeling global climate, and solving other highly complex problems.

More Accessible Than Ever

In the past, the most common way to apply multiple compute cycles to a complex problem was to use specialized supercomputing hardware – a solution with a very high cost of entry and technical complexity.
However, recent software and hardware advances have made it possible to leverage existing IT skills and create an HPC environment using off-the-shelf servers and high speed interconnects. These systems can deliver industry-leading computing power with more efficiency and at a significantly lower cost of entry and ownership.  This form of HPC is called a commodity HPC cluster. 

Basic Architecture of an HPC Cluster

A cluster consists of several servers networked together, where each server in the cluster performs one or more specific tasks. Cluster components include the Head Node, Compute Nodes, the Job Scheduler, and Broker Nodes (for SOA-enabled clusters).

Head Node

The single point of management and job scheduling for the cluster. It provides failover support, and it controls and mediates access to the cluster resources.

Compute Node

Carries out the computational tasks assigned to it by the job scheduler.

Job Scheduler

Queues jobs and their associated tasks. It allocates resources to these jobs, initiates the tasks on the compute nodes, and monitors the status of jobs, tasks, and compute nodes.

Broker Node

Acts as an intermediary between the application and the services. The broker load-balances service requests across the services and returns the results to the application.
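To make the division of labor concrete, here is a deliberately simplified Python sketch of these roles. It is not the Windows HPC Server API; every class and method name is invented for illustration. A head-node-style scheduler queues jobs and hands tasks to compute nodes, while a broker round-robins SOA-style requests across the same nodes.

    from collections import deque
    from itertools import cycle

    class ComputeNode:
        """Carries out tasks handed to it by the scheduler or broker."""
        def __init__(self, name: str):
            self.name = name

        def run(self, task):
            return f"{self.name} finished {task}"

    class JobScheduler:
        """Queues jobs and dispatches their tasks to compute nodes (FIFO)."""
        def __init__(self, nodes):
            self.nodes = nodes
            self.queue = deque()

        def submit(self, job_name: str, tasks):
            self.queue.append((job_name, list(tasks)))

        def run_all(self):
            results = []
            while self.queue:
                job, tasks = self.queue.popleft()
                for i, task in enumerate(tasks):
                    node = self.nodes[i % len(self.nodes)]  # naive placement
                    results.append(node.run(f"{job}/{task}"))
            return results

    class BrokerNode:
        """Load-balances interactive (SOA-style) requests across compute nodes."""
        def __init__(self, nodes):
            self._next = cycle(nodes)

        def request(self, payload):
            return next(self._next).run(payload)

    # Tiny usage example
    nodes = [ComputeNode(f"node{i}") for i in range(3)]
    scheduler = JobScheduler(nodes)
    scheduler.submit("render", ["frame1", "frame2", "frame3", "frame4"])
    print(scheduler.run_all())

    broker = BrokerNode(nodes)
    print(broker.request("price-quote"))

A real cluster adds failure handling, node heartbeats, and resource-aware placement on the head node; this sketch only shows who asks whom to do what.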

Sunday, April 11, 2010

Microsoft Windows SteadyState

Windows SteadyState


Share computers, not headaches

What state is your shared computer in at the end of the day?
  • Hard disk filled with downloaded files?
  • Strange options configured?
  • Programs installed that you don't want?
  • System infected with viruses and spyware?
  • Computer bogged down for unknown reasons?
Windows SteadyState, successor to the Shared Computer Toolkit, is designed to make life easier for people who set up and maintain shared computers.
An easy way to manage multiple users
You can manage whole groups of users as single user accounts. The new Windows SteadyState console makes it easier than ever to create and modify user profiles.
A locked-down platform for stable shared computing
Not every computer user should have access to every software capability. Your system can be more stable and consistent when you limit user access to control panel functions, network resources, and other sensitive areas.
Set it and forget it
Once you have everything set up the way you want it, you can share the computer and rest easy. Any changes a user might make to the configuration or hard disk can be undone by simply restarting the machine.

Thursday, April 8, 2010

Microsoft Security Compliance Manager




The Microsoft security experts on the Solution Accelerators team are working on a free new tool to help organizations plan, deploy, operate, and manage security baselines for Windows® client and server operating systems, and Microsoft applications. Check out this new video about Security Compliance Manager from the dev team; then tell us what your favorite feature is!

Wednesday, April 7, 2010

Overview of Remote Desktop Web Access

Overview of Remote Desktop Web Access (RD Web Access)

New Feature from Microsoft

Applies To: Windows Server 2008 R2

Remote Desktop Web Access (RD Web Access), formerly Terminal Services Web Access (TS Web Access), enables users to access RemoteApp and Desktop Connection through the Start menu on a computer that is running Windows 7 or through a Web browser. RemoteApp and Desktop Connection provides a customized view of RemoteApp programs and virtual desktops to users.
Additionally, RD Web Access includes Remote Desktop Web Connection, which enables users to connect remotely from a Web browser to the desktop of any computer where they have Remote Desktop access.
When a user starts a RemoteApp program, a Remote Desktop Services session is started on the Remote Desktop Session Host (RD Session Host) server that hosts the RemoteApp program. If a user connects to a virtual desktop, a remote desktop connection is made to a virtual machine that is running on a Remote Desktop Virtualization Host (RD Virtualization Host) server.
To provide users access to RemoteApp and Desktop Connection, you must configure RD Web Access to specify the source that provides the RemoteApp programs and virtual desktops that are displayed to users. You can configure RD Web Access to use either of the following:
  • Remote Desktop Connection Broker (RD Connection Broker) server
  • RemoteApp source
An RD Connection Broker server provides users access to virtual desktops hosted on RD Virtualization Host servers and to RemoteApp programs hosted on RD Session Host servers. To configure the RD Connection Broker server, use the Remote Desktop Connection Manager tool. For more information, see the Remote Desktop Connection Manager Help in Windows Server 2008 R2. For more information about RemoteApp and Desktop Connection, see the Remote Desktop Services page on the Windows Server 2008 R2 TechCenter (http://go.microsoft.com/fwlink/?LinkId=143108).
A RemoteApp source is an individual RD Session Host server or a farm of identically configured RD Session Host servers on which RemoteApp programs have been configured. You can specify multiple RemoteApp sources. To configure RemoteApp programs on an RD Session Host server, use RemoteApp Manager.

Deploying RD Web Access

You must install the RD Web Access role service on the server that you want users to connect to over the Web to access RemoteApp programs. When you install RD Web Access, Microsoft Internet Information Services (IIS) is also installed as a required component.
After you install RD Web Access, you must specify whether to use an RD Connection Broker server or a RemoteApp source as the source that provides the RemoteApp programs and virtual desktops that are displayed to users through RemoteApp and Desktop Connection. For more information, see Configure the RD Web Access Server for RemoteApp and Desktop Connection.
If you want users to access the Web page from the Internet, you can use RD Gateway to help secure remote connections. For more information, see Checklist: Make RemoteApp Programs Available from the Internet.
For more information about RD Web Access, see the Remote Desktop Services page on the Windows Server 2008 R2 TechCenter (http://go.microsoft.com/fwlink/?LinkId=140437).

Emotional intelligence, leadership, management, teams, team building




Tuesday, April 6, 2010

Intel Labs announces Single-chip Cloud Computing experimental chip


Cluster Computing



A computer cluster is a group of linked computers working together closely so that in many respects they form a single computer. The components of a cluster are commonly, but not always, connected to each other through a fast LAN. Clusters are usually deployed to improve performance and/or availability over that of a single computer, while typically being much more cost-effective than single computers of comparable speed or availability.

Cloud computing


Cloud computing logical diagram

Cloud computing is Internet-based computing, whereby shared resources, software and information are provided to computers and other devices on-demand, like a public utility.

It is a paradigm shift following the shift from mainframe to client-server that preceded it in the early '80s. Details are abstracted from the users, who no longer need expertise in, or control over, the technology infrastructure "in the cloud" that supports them.[1] Cloud computing describes a new supplement, consumption, and delivery model for IT services based on the Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet.[2][3] It is a byproduct and consequence of the ease of access to remote computing sites provided by the Internet.[4]

The term cloud is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network,[5] and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.[6] Typical cloud computing providers deliver common business applications online which are accessed from another web service or software like a web browser, while the software and data are stored on servers.

A technical definition is "a computing capability that provides an abstraction between the computing resource and its underlying technical architecture (e.g., servers, storage, networks), enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction."[7] This definition states that clouds have five essential characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.[7]

The majority of cloud computing infrastructure, as of 2009, consists of reliable services delivered through data centers and built on servers. Clouds often appear as single points of access for all consumers' computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers and typically offer SLAs.[8]