
Dryad and DryadLINQ Installation

and Configuration Guide


Version 1.0.1 – November 10, 2009

Abstract

DryadLINQ depends on the Dryad execution engine to run the queries on a Windows®
HPC Server 2008 cluster. This paper describes:
 How to install Dryad on a Windows HPC cluster.
 How to install DryadLINQ on a workstation and configure it to run
applications on the Windows HPC cluster.
Note:
 Most resources discussed in this paper are provided with the DryadLINQ package.
For information about how to obtain all documents referenced in this paper, see
“Resources” at the end of this paper.
 For project updates, feedback, and community discussion, see
http://connect.microsoft.com/dryad
 Please submit comments on this document at the above community site or to
dryadlnq@microsoft.com.

Contents

Introduction
How to Install Dryad on a Windows HPC Cluster
    Cluster System Requirements
    Dryad Installation
How to Install DryadLINQ on a Workstation
    DryadLINQ Workstation System Requirements
    How to Install and Configure DryadLINQ
    Verify the Installation
Resources

Disclaimer: The information contained in this document represents the current view of Microsoft
Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to
changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and
Microsoft cannot guarantee the accuracy of any information presented after the date of publication.
This White Paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS,
IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.
Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under
copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or
transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or
for any purpose, without the express written permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights
covering subject matter in this document. Except as expressly provided in any written license agreement
from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks,
copyrights, or other intellectual property.
Unless otherwise noted, the example companies, organizations, products, domain names, e-mail
addresses, logos, people, places and events depicted herein are fictitious, and no association with any real
company, organization, product, domain name, email address, logo, person, place or event is intended or
should be inferred.
© 2008-2009 Microsoft Corporation. All rights reserved.
Microsoft, Visual C#, Visual Studio, Windows, and Windows Server are either registered trademarks or
trademarks of Microsoft Corporation in the United States and/or other countries.
The names of actual companies and products mentioned herein may be the trademarks of their respective
owners.
Document History

Date                 Change
Feb 18, 2009         Preview draft v.0.9
June 30, 2009        Version 1.0
November 10, 2009    Version 1.0.1


Introduction
Dryad is a high-performance, general-purpose distributed computing engine that is
designed to simplify the task of implementing distributed applications on clusters of
computers running the Windows® operating system. DryadLINQ allows developers to
implement Dryad applications in managed code by using an extended version of the
LINQ programming model and API.
DryadLINQ depends on the Dryad execution engine to run queries on a Windows HPC
Server 2008 cluster. Figure 1 shows the basic setup for running DryadLINQ applications.

[Figure 1 is a diagram showing client workstations connected to a Dryad cluster,
which consists of a head node and a set of compute nodes.]

Figure 1. Dryad cluster configuration

In Figure 1:
client workstations
Programmers develop and run DryadLINQ applications on Windows-based
workstations.
Dryad cluster
The DryadLINQ provider on the workstation runs DryadLINQ queries on the Dryad
cluster as a distributed application. A Dryad cluster is a Windows HPC cluster
running Dryad software.
head node
The cluster’s head node manages the cluster.
compute nodes
The cluster’s compute nodes handle the distributed computations. The head
node can also serve as a compute node. Input and output data is also stored on
the compute nodes in shared folders created during installation of Dryad.
Important: Windows HPC Server 2008 supports a set of cluster topologies. Depending
on the cluster topology, a client workstation may or may not be able to communicate
with the compute nodes. However, a DryadLINQ application needs access to the data
stored in files on the compute nodes in between executing distributed LINQ queries.
DryadLINQ developers have the most flexibility when their workstation can access
the compute nodes directly. When compute nodes are isolated on a private network,

the DryadLINQ application should run on the head node or the workstation should be
joined to the private network.

This paper assumes that you already have acquired a Windows HPC cluster and are
familiar with its operation. For details, see “Windows HPC Server 2008,” listed in
“Resources” at the end of this paper.

The installation procedures depend on your role:


 Cluster administrators install Dryad on the cluster’s head node and compute
nodes.
For details, see “How to Install Dryad on a Windows HPC Cluster” later in this
paper.
 Developers install DryadLINQ on their workstation and configure it to use the
Dryad cluster.
For details, see “How to Install DryadLINQ on a Workstation” later in this paper.
This document covers only installation procedures. For more information, see these
references listed in “Resources” at the end of this paper:
 For a general discussion of Dryad and DryadLINQ, see “Dryad and DryadLINQ:
An Introduction.”
 For a discussion of how to use DryadLINQ to implement distributed
applications by using Dryad, see “DryadLINQ Programming Guide.”

How to Install Dryad on a Windows HPC Cluster


A Dryad cluster is a Windows HPC cluster with Dryad software installed. Clusters can
consist of hundreds of individual compute nodes, but Dryad can be useful even on
three- or four-node clusters.
Each computer in a Windows HPC cluster is referred to as a node. There are two node
types:
 The head node manages the cluster and assigns computations to the
compute nodes.
You install Dryad Management Tools on the head node. The tools include a
management service that cleans up data files after they are no longer needed.
 Compute nodes perform the actual computations.
You install the Dryad computation engine on the compute nodes. This software is
a thin adapter between Dryad and Windows HPC.

To install Dryad on a Windows HPC cluster, you must:


 Install Dryad Management Tools on the head node.
 Install the Dryad computation engine on the compute nodes.
This section describes the requirements for a Dryad cluster, and how to install Dryad
on the cluster.

Cluster System Requirements


The following hardware and software requirements are specific to running Dryad
on a Windows HPC cluster and exceed the minimum Windows HPC requirements:
 Microsoft® HPC Pack 2008 SP1
 At least 200 GB of free hard-drive storage per node
The appropriate amount of hard-drive storage depends on the amount of data
you expect to process. Production compute nodes typically have 1–3 TB of hard-
drive storage per node.
 4–8 GB of RAM
Dryad is functional on cluster computers with relatively modest amounts of RAM.
However, optimization heuristics might assume ample memory, so relatively
small amounts of RAM can limit performance. Dryad is optimized for 8 GB or more
of RAM. However, the optimal amount of RAM depends on a number of factors,
and installing more than 4 GB of RAM on your compute nodes might not provide
any additional benefit. If you are considering installing more than 4 GB, you
should discuss your requirements with Microsoft.
 Gigabit Ethernet
Gigabit Ethernet (GigE) is the recommended minimum for connecting the cluster
computers. Network capacity is often the rate-limiting factor, so faster
networking—such as 10 GigE—typically improves performance.
Ideally, all computers in the cluster are connected to the same dedicated network
switch, but larger clusters typically use a hierarchy of switches. If the cluster
spans multiple racks, you can use IEEE 802.3ad link aggregation to improve
performance between switches.
 Microsoft .NET Framework, Version 3.5 SP1
This version of the .NET Framework must be installed on all compute nodes. If
you install Dryad Management Tools, this version of .NET Framework must also
be installed on the head node.
Dryad Management Tools requires the following:
 Access to a data store
Dryad Management Tools uses the data store to record state information for
those Dryad jobs that are subject to the data retention policy. You must have
access to the data store; otherwise, the installation will fail. The supported data
stores are Microsoft SQL Server® 2005 or later, and SQL Server Express 2005 or

later. If you install the Microsoft HPC Pack 2008, the installer automatically
installs SQL Server Express, which is usually sufficient.
For more details on retention policy, see “How to Specify the Data Retention
Policy” later in this paper.
 Access to the Windows HPC cluster
Dryad Management Tools programmatically queries the Windows HPC cluster to
determine the status of Dryad jobs.
 Access to the data shares that are created when you install the Dryad
computation engine on the compute nodes

Dryad Installation
The two Dryad .msi files are used as follows:
 DryadHPC.msi is a full installation that installs Dryad Management Tools on
the head node and installs the Dryad binaries on all online compute nodes.
 DryadHPCComputeNode.msi installs the Dryad computation engine on
selected compute nodes.
Important: You must be an administrator for the Windows HPC cluster to run either
installer.

How to Use DryadHPC.msi to Perform a Full Installation


DryadHPC.msi is typically used for new Dryad installations. The installation wizard
handles either or both of the following:
 Install Dryad Management Tools
 Install the Dryad binaries on all compute nodes

To run DryadHPC.msi

1. Copy DryadHPC.msi to the cluster’s head node.


2. Run DryadHPC.msi, which opens the installation wizard.
Important: You must be a member of the Domain\Administrators or
LocalMachine\Administrators group to run DryadHPC.msi. The way that you run the
installer depends on the User Account Control (UAC) setting on the head node:
 If UAC is set to Off, double-click the file name to run DryadHPC.msi.
 If UAC is set to On, run DryadHPC.msi from the command line, as follows:
1. Open a command window with administrative privileges.
2. Enter the following msiexec command to run the installer:
msiexec /i DryadHPC.msi /l*v DryadHPC_Install.log
3. Accept the EULA when prompted, and move to the next installation wizard
page, which asks whether you want to install Dryad Management Tools.


Recommended: Install the management service. The head node must have .NET
Framework Version 3.5 SP1 installed before you can install Dryad Management Tools.

To install Dryad Management Tools

1. On the Dryad Management Tools Installation Options wizard page, select Install
Tools, as shown in Figure 2.

Figure 2. Dryad Management Tools Installation Options page


2. Specify an installation folder.
Dryad Management Tools should be installed on the head node.
3. Click Next to move to the next page:
 If you choose to install the Dryad Management Tools, the next page is
Management Tools Database and HPC Cluster Settings, as shown in Figure 3.
 If you choose not to install the Dryad Management Tools, the next
page is Dryad Installation Options, as shown in Figure 4.

To specify the Management Tools database

1. On the Management Tools Database and HPC Cluster Settings wizard page,
verify the connection information for the data store.
Figure 3 shows the default settings. Your user account must have the rights to
create a database in this data store.
Note: You may choose to install your own database engine. For example, installing
SQL Server Express 2005 creates a server instance named SQLEXPRESS by default.
You can select a different instance name, such as SQLDRYAD, by clearing the Hide
advanced options check box during the SQL Server Express installation. Then,
during the Dryad installation, specify LOCALHOST\SQLDRYAD (assuming you named
the instance SQLDRYAD) in the SQL Server field shown in Figure 3.


Figure 3. Management Tools Database and HPC Cluster Settings page


2. Verify the cluster name.
The cluster information is used by Dryad Management Tools to connect to the
cluster and query for Dryad jobs. It is displayed as a read-only field.
3. Click Next to go to the Dryad Installation Options page.
4. If you want to install the Dryad computation engine on every compute node that
is currently online, select Install Dryad On all the compute nodes in the cluster.
5. Click Next to go to the Dryad Installation Options page, as shown in Figure 4.

To install the Dryad binaries on all online compute nodes

1. On the Dryad Installation Options wizard page, specify a drive for the Dryad data
and system shares, as shown in Figure 4.

Figure 4. Dryad Installation Options page


2. Click Next to go to the installation wizard’s completion page.
3. Click Install to install Dryad.

Dryad applications depend on shared access to input data. To provide this access, the
installer creates the following shares and folders on the cluster’s head and compute
nodes:
 A share named DryadData to store the user’s input data files.
 A share named XC to store any intermediate data files that might be
generated by Dryad applications.
The intermediate files generated by an application are stored in a folder that is
named with the job identifier.
 A folder named “output” under each XC share, to serve as the default folder
for users’ output data files.
Output data is stored in the folder specified by the DryadOutputDir directive in
the DryadLINQ configuration file, DryadLINQConfig.xml. The default value for this
folder is XC\output.
 A folder named XCSetup under the head node’s XC share.
The installer places DryadHPCComputeNode.msi in XCSetup. If you chose to
install Dryad on the compute nodes, XCSetup also contains the cluster installation
logs generated by individual nodes.
The installer creates the shares and folders on all online compute nodes, and then it
copies the Dryad computation engine binaries to the compute nodes.
Important: The installer creates these shares on the drive specified for each compute
node, so make sure that the drive exists on all nodes.
After Dryad is installed, DryadLINQ users typically create their own folder under the
DryadData shares. A user’s folder is typically named with the user name. For example,
a user named DryadUser_1 would use the following folders on each node to store her
permanent data:
 User’s data would be placed on \\NodeName\DryadData\DryadUser_1.
 Intermediate files that the system generates when running a DryadLINQ
query would be placed in \\NodeName\XC\DryadUser_1\Job_ID. Job_ID refers to
the ID of the HPC job corresponding to the execution of the DryadLINQ query on
the cluster.
 The pieces of an output partitioned file would be placed under the path
specified by the PartitionUncDir configuration parameter. For example, if
PartitionUncDir is DryadData\DryadUser_1\Output, pieces of a partitioned file
will be saved under \\NodeName\DryadData\DryadUser_1\Output on the
appropriate compute nodes. If the location of a partitioned file is not specified,
the metadata file for the partition will be created in the folder specified by
DryadOutputDir. To save partitioned files
under \\NodeName\DryadData\DryadUser_1\Output by default, one would set
DryadOutputDir to file://\\NodeName\DryadData\DryadUser_1\Output.
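
The per-user layout described above maps directly onto the DryadLINQ configuration file mentioned earlier. The following is a sketch of how the relevant DryadLINQConfig.xml attributes could be set for such a user; MyClusterName, NodeName, and DryadUser_1 are placeholders for your own cluster, node, and user names:

```xml
<DryadLinqConfig>
  <ClusterName>MyClusterName</ClusterName>
  <Cluster name="MyClusterName"
           schedulertype="Hpc"
           partitionuncdir="DryadData\DryadUser_1\Output"
           dryadoutputdir="file://\\NodeName\DryadData\DryadUser_1\Output"
  />
</DryadLinqConfig>
```

With these settings, pieces of partitioned output files land under each compute node's DryadData\DryadUser_1\Output folder, and unpartitioned output defaults to the same share.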


Note: The XC and DryadData shares are not affected when Dryad is uninstalled; they
continue to be shared. Upon reinstallation, existing shares are reused. However, if
the shares were created on one disk (for example, C:\XC and C:\DryadData) and a
subsequent installation uses another disk (for example, D:\), sharing on the C drive
is stopped and new shares are established on the D drive.

How to Use DryadHPCComputeNode.msi to Install Dryad Selectively


DryadHPCComputeNode.msi installs the Dryad computation engine on a selected
compute node. It is typically used to install the computation engine on compute
nodes that were offline when Dryad was originally installed or had to be taken offline
to be re-imaged.
Important: You must be a cluster administrator to run DryadHPCComputeNode.msi.

To install the Dryad computation engine on a compute node

1. Copy DryadHPCComputeNode.msi to the node.


2. Run DryadHPCComputeNode.msi by opening a command window with
administrative privileges and running the following command:
msiexec /i DryadHPCComputeNode.msi /l*v DryadHPCComputeNode_Install.log
This command creates an installation log in the head node’s XC\XCSetup folder.
3. Accept the EULA and then click Next to move to the Dryad Installation Options
page, as shown in Figure 5.

Figure 5. Dryad Installation Options page


4. Specify a drive for the data share.
5. Click Next to go to the completion page.
6. Click Install to install Dryad.
You can also use the Windows HPC clusrun tool to install the Dryad computation
engine from the command line, including installing the engine on multiple nodes. The

following procedure installs the computation engine on a single node. For more details on
how to use clusrun, see the clusrun documentation in the Windows HPC Cluster
Manager help file, which is included with the Microsoft HPC Pack 2008 client utilities.
The following procedure shows how to install the binaries on a node. You can install
the binaries on the head node only if it is also a compute node.

To use clusrun to install the computation engine

1. Copy DryadHPCComputeNode.msi to the head node.


2. To install the computation engine, run the following command from a command
window with administrative privileges:
clusrun /outputdir:\\NodeName\XC\XCSetup\Logs msiexec /i \\NodeName\XC\XCSetup\
DryadHPCComputeNode.msi /l*v+ \\NodeName\XC\XCSetup\Logs\ClusterInstallation.log /quiet
INSTALLDIR=<XCDRIVE>
In this command line, replace NodeName with the name of your cluster’s head
node, and replace <XCDRIVE> with the drive that contains the XC share.
For example, to install the binaries on a head node named MyHeadNode-01, run the
following:
clusrun /outputdir:\\MyHeadNode-01\XC\XCSetup\Logs msiexec /i \\MyHeadNode-01\XC\XCSetup\
DryadHPCComputeNode.msi /l*v+ \\MyHeadNode-01\XC\XCSetup\Logs\ClusterInstallation.log /quiet
INSTALLDIR=C:\
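
The clusrun examples above target every node. To install on a specific subset of nodes, clusrun also accepts a node list via its /nodes option. The following sketch assumes compute nodes named NODE-01 and NODE-02 and a head node named MyHeadNode-01; substitute your own node names, and see the clusrun documentation for the full option syntax:

```
clusrun /nodes:NODE-01,NODE-02 /outputdir:\\MyHeadNode-01\XC\XCSetup\Logs msiexec /i \\MyHeadNode-01\XC\XCSetup\
DryadHPCComputeNode.msi /l*v+ \\MyHeadNode-01\XC\XCSetup\Logs\ClusterInstallation.log /quiet
INSTALLDIR=C:\
```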

How to Specify the Data Retention Policy


Dryad applications often generate data files on the compute nodes that do not need
to be preserved. Dryad Management Tools automatically removes these files after a
time period specified in the data retention policy.
After you have installed Dryad on the cluster, you can use the Dryad Management
Console on the head node to specify the data retention policy.
Note: The Dryad Management Console is intended for the user who also installs the
management tools on the head node. When another user (with administrative
privileges) wants to run the Dryad Management Console, this user must follow these
setup steps:
i. Run <install_path>\Tools\ConnectionManagerEditor.exe -user -create
ii. Run <install_path>\Tools\ConnectionManagerEditor.exe -user -manage
iii. Provide the Server Name, select Trusted Connection, and specify the Database
Name. After saving the connection, mark the connection as the default
connection for the current user.
iv. Close Connection Manager.
v. Run the Management Console.


To specify the data retention policy

1. To open the management console on the head node, from the Start menu, click
All Programs and run Microsoft Research DryadLINQ > Dryad Management
Console.
2. Click View Policies.
Figure 6 shows the System Retention Policy Details tool in the DryadLINQ
Management Console.
3. In the Policy Applicability section, specify the retention time.
Set Older than to the appropriate retention time. The default value is 72 hours,
which removes all Dryad files on the compute node data shares that are older
than 72 hours.
4. Specify the policy enforcement schedule.
The management console performs cleanup on a regular schedule. To specify a
cleanup schedule, set Check policy every to the appropriate value. The default
value is once every hour.
5. Specify the job synchronization schedule.
The management service stores Dryad job data in its own data store. To specify
how frequently the data store should be updated, set Synchronize jobs every to
the appropriate value. The default value is once every 5 minutes.
6. Specify the Dryad data store clean-up policy.
This setting specifies when the details of Dryad jobs that have been removed
from the compute nodes are to be removed from the Dryad data store. Set the
value of Older than to the appropriate data-store retention time. The default
value is 7 days.
7. Click Save to save the changes.
8. Restart the DryadDLMgmtSvc service.
The changes take effect only after you restart the DryadDLMgmtSvc service. The
display name of this service is Dryad + DryadLINQ Management Services.


Figure 6. DryadLINQ Management Console, showing System Retention Policy Details

How to Install DryadLINQ on a Workstation


You implement and run DryadLINQ applications from a workstation with DryadLINQ
installed.

DryadLINQ Workstation System Requirements


You can use any workstation that is capable of running a version of Windows that
supports the Microsoft HPC Pack 2008 client utilities. DryadLINQ is tested on the
following Windows versions:
 Windows 7 Enterprise, 32-bit and 64-bit versions
 Windows Vista Enterprise SP1, 32-bit and 64-bit versions
 Windows Server 2008, 64-bit only
 Windows XP Professional SP3, 32-bit only

Note: Microsoft HPC Pack 2008 also runs on several other versions of Windows,
which will likely run DryadLINQ successfully. However, DryadLINQ has been
tested only on the versions in the preceding list.

You should install the following software:


 Microsoft HPC Pack 2008 client utilities, with Service Pack 1.


These utilities are included with the Windows HPC Server 2008 package, so you
must obtain them from your cluster administrator. They are not included with
Microsoft HPC Pack 2008 SDK.
 Microsoft .NET Framework Version 3.5, SP 1
 Microsoft Visual Studio® 2008

How to Install and Configure DryadLINQ


The following procedure installs all the components that are required to develop and
run DryadLINQ applications.

To Install DryadLINQ

1. Copy the appropriate DryadLINQ installer to a convenient location on your
workstation’s hard drive.
 To install DryadLINQ on an x86 system, use DryadLINQ_x86.msi.
 To install DryadLINQ on an x64 system, use DryadLINQ_x64.msi.
2. Run the installer.
3. Accept the EULA.
4. Complete the installation wizard.
You can specify an installation folder if you prefer, but most users can simply click
Next and then click Install to install DryadLINQ to the default root folder, which
is:
C:\Program Files\Microsoft Research DryadLINQ
Each DryadLINQ application must include a configuration file, DryadLINQConfig.xml,
which contains the information that is required to run jobs on the Dryad cluster.
Because most of this information does not vary from application to application, the
simplest approach is to create a global configuration file and point to it from your
project.
The global configuration file is also named DryadLINQConfig.xml, located in the
DryadLINQ root folder. The DryadLINQ installer places a template in the root folder,
which contains the following code:
<DryadLinqConfig>
<ClusterName>MyClusterName</ClusterName>
<Cluster name="MyClusterName"
schedulertype="Hpc"
partitionuncdir = "XC\output"
dryadoutputdir="file://\\NodeName\XC\output"
/>
</DryadLinqConfig>

To create a global configuration file

 Modify the placeholder attribute values in the DryadLINQConfig.xml template to
specify appropriate values for your workstation and cluster.

You should be able to use the default values for the other elements and attributes.
The attributes are as follows:
ClusterName Element
Change MyClusterName to the name of your cluster’s head node.
Cluster Element
Set the name attribute to the name of your cluster’s head node. This name and
the value of the ClusterName element must match exactly.
Set the dryadoutputdir attribute to a writeable share on one of the cluster’s
compute nodes by changing the value of NodeName. The values of
partitionuncdir and dryadoutputdir are customizable.
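
For example, for a cluster whose head node is named MYHEADNODE and also hosts an XC share, a completed global configuration file might look like the following sketch (MYHEADNODE is a placeholder for your own head node; note that the ClusterName value and the Cluster name attribute match exactly, as required above):

```xml
<DryadLinqConfig>
  <ClusterName>MYHEADNODE</ClusterName>
  <Cluster name="MYHEADNODE"
           schedulertype="Hpc"
           partitionuncdir="XC\output"
           dryadoutputdir="file://\\MYHEADNODE\XC\output"
  />
</DryadLinqConfig>
```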
For more discussion of DryadLINQConfig.xml, see “DryadLINQ Programming Guide.”
Most project-level DryadLINQConfig.xml files can simply point to the global file. The
following example shows the contents of a typical project-level DryadLINQConfig.xml
file. The DryadLinqRoot element specifies the DryadLINQ root folder, which contains
the global file. This example sets DryadLinqRoot to the typical root folder,
C:\Program Files\Microsoft Research DryadLINQ. Most DryadLINQ applications don’t
require any additional settings.
<DryadLinqConfig>
<DryadLinqRoot>
C:\Program Files\Microsoft Research DryadLINQ
</DryadLinqRoot>
</DryadLinqConfig>

Verify the Installation


After installing DryadLINQ, you should verify that Dryad is correctly installed and
running properly by compiling and running one of the DryadLINQ samples. For
details, see “DryadLINQ Programming Guide.”
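
Once the configuration file is in place, a minimal program exercises the whole pipeline: workstation, provider, cluster, and data shares. The following is a rough sketch only; the LinqToDryad namespace, the PartitionedTable and LineRecord types, and the input path are assumptions based on the DryadLINQ samples, so start from one of the installed samples and consult the “DryadLINQ Programming Guide” for the authoritative API.

```csharp
// Sketch of an installation smoke test: read a partitioned text file from the
// cluster, count the lines that contain "dryad", and print the result.
using System;
using System.Linq;
using LinqToDryad;  // DryadLINQ provider namespace (assumed from the samples)

class InstallCheck
{
    static void Main()
    {
        // input.pt is a hypothetical partitioned-file metadata file that you
        // would create beforehand under your DryadData folder on the cluster.
        PartitionedTable<LineRecord> lines = PartitionedTable.Get<LineRecord>(
            @"file://\\MyHeadNode-01\DryadData\DryadUser_1\input.pt");

        // The Where/Count query below is shipped to the Dryad cluster by the
        // DryadLINQ provider and executed as a distributed HPC job.
        int count = lines.Where(r => r.line.Contains("dryad")).Count();

        Console.WriteLine("Lines containing \"dryad\": {0}", count);
    }
}
```

If the job appears in Windows HPC Cluster Manager and the count prints on the workstation, Dryad, DryadLINQ, and the data shares are all working together.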

Resources
The following list provides links to related information.
Dryad – Microsoft Research Project Page
http://research.microsoft.com/en-us/projects/dryad/
DryadLINQ – Microsoft Research Project Page
http://research.microsoft.com/en-us/projects/dryadlinq/
Dryad and DryadLINQ Manuals
Manuals and samples are provided with the DryadLINQ installation in the folder
<install_path>\Docs, including:
“Dryad and DryadLINQ: An Introduction”
“DryadLINQ Programming Guide”
Windows HPC Server 2008
http://www.microsoft.com/hpc/

