
FoxTalk

Solutions for Microsoft® FoxPro® and Visual FoxPro® Developers

April 1999
Volume 11, Number 4

1   Turn Your VFP App Client/Server: A 12-Step Program
    Jim Falino
2   Editorial: The Dismantling of Microsoft
    Whil Hentzen
9   Driving the Data Bus: Third-Party Reporting Tools
    Andrew Coates
14  Solving a Mystery
    Michael Levy
17  What's Really Inside: The Basics of Buffering
    Jim Booth
20  The Kit Box: Buffering: The Transaction Slayer
    Paul Maskens and Andy Kramek
23  April Subscriber Downloads
EA  Best Practices: Seeing Patterns: Chain of Responsibility
    Jefferey A. Donnici
EA  Reusable Tools: Building a Builder Builder
    Doug Hennig
EA  Visual Basic for Dataheads: A Command and Function Summary—Part 2
    Whil Hentzen

Turn Your VFP App Client/Server: A 12-Step Program

Jim Falino

Applies to VFP v6.0 (Tahoe), VFP v5.0, VFP v3.0, FoxPro v2.x. (Applies specifically to one of these platforms.)
Accompanying files available online at http://www.pinpub.com/foxtalk

Have any of your clients or your boss yet asked you, "What will it take to make this VFP application access a SQL Server back end?" Well, Jim's been there and done that. So in this first of a two-part series, he'll show you how to re-architect your application in order for it to access, literally, any ODBC-compliant data source. In fact, with the proper planning and design, you'll see how your application can even access Visual FoxPro tables as well. This will enable you to prototype your application against Fox data before deploying it on a remote data source.

THE "big question" was asked of me about three years ago and—fortunately—it was asked before I began writing even a single line of code. That was the good news. The bad news was that the same set of code was expected to access native Visual FoxPro tables if a client didn't have a need for the features of a powerful and much more expensive RDBMS, like MS SQL Server. That might sound like a daunting task, but with the kind of forewarning I was given, coupled with the flexibility of VFP, I was able to satisfy the project's requirements.

All it really takes is an understanding of how client/server applications work. If you can achieve that, then it's just a matter of making sure that everything you code will work under both scenarios. That being said, the goal of this article isn't just to explain how to make a VFP application access a SQL Server database, but also to alert you to the pitfalls of file-server development that would require a complete redesign.

Continues on page 4
2 FoxTalk April 1999 http://www.pinpub.com
From the Editor

The Dismantling of Microsoft


Whil Hentzen

IF you haven't heard the news by now, you've probably been living in a cave, but I'll repeat it for those few of you who are unaware. The United States Department of Industry, Commerce, and Knowledge has prevailed in their various legal machinations against our generous benefactor and beloved parent of Visual FoxPro—Microsoft Corporation—and has proposed a plan to divide Microsoft into multiple businesses in order to alleviate the charges of monopolistic practices and unfair bundling of products. As FoxPro's journal of record for the past 11 years, it's only appropriate to describe what that plan is and what it means to you.

First, I'm going to point out some background that will help you understand the reasoning behind the plan, and to provide you with additional information that will help you make your own plans during these tumultuous times.

In the late 1800s and early 1900s, industrialists (what they called "businessmen" back then) would use a combination of shrewd business knowledge, unscrupulous practices, and blatantly illegal techniques to form a company that dominated the industry it was in. John D. Rockefeller, as many of you know, was perhaps the most well-known of these industrialists, having formed the world's largest corporation and effectively monopolized the domestic oil business. He continued to buy competitors or put them out of business, in all aspects of oil production, from discovery to refining to distribution. This combination of a broad range of companies was popularly referred to as a "Trust," such as in the "Standard Oil Trust."

Teddy Roosevelt went after the Trust and broke it up because of a personal blood feud with Rockefeller. Standard Oil was divided into multiple companies, each geographically restricted. Thus, we had Standard Oil of Illinois, Standard Oil of New Jersey, Standard Oil of Ohio, and so on. Some changed their names (SO-New Jersey to Exxon, Standard Oil of Ohio to SOHIO, and then to British Petroleum, and so forth).

Today, some aspects of the legal battle between Microsoft and the government are similar, and some are different. For example, the term "Trust" isn't ever used when referring to Bill Clinton's administration or Microsoft's far-flung operations, and there isn't any personal animosity between the commander-in-chief and the head of the world's largest independent software company. And while Microsoft competes so hard as to be overwhelming, it doesn't have the blatant disdain for the law that the Trusts of a century ago openly carried. However, the charges of monopolistic practices and unfair bundling of products are very similar to those of Roosevelt's administration, and these must have formed the basis for the government's decision on how to break up Microsoft.

Pundits have guessed that the company would be divided into product lines, such as operating systems, home-based (or "edutainment") products, development tools, and so on. This was a completely reasonable assumption, of course, both because of the charges that profits from one product or product group were unfairly used to subsidize another product, and because Microsoft's actual organizational structure along those lines made it easy to see how the breakup would be done.

Continues on page 23



A 12-Step Program . . .
Continued from page 1

If you're in the same position I was, then this article will come in quite handy. It will serve as a guide to developing a client/server application where VFP is used only as the front end. (Actually, VFP could be used as the middle tier as well, but to keep focus on the database, assume a two-tiered, or fat client, client/server model.) If you've already begun, or even finished, a file-server application that needs to be upsized, this article will enable you to determine how much work you have on your hands.

Some might be wondering why I'm not mentioning the possibility of using a VFP back end through ODBC as well. I've tried that and found that the limitations of the VFP ODBC driver made it unworthy of my effort. If you're going to use VFP in a file-server environment, you might as well enjoy all of its benefits and not be hampered by driver limitations.

It's all about views and wrappers
I've cataloged the issues you'll need to be aware of into a 12-step program (everybody has one, why shouldn't I?). There are two central themes running through the 12 steps—views and wrappers. If you can keep these two concepts in mind at all times, you can't go wrong. By the way, the 12 steps aren't in any particular order. I'll cover steps 1-6 this month and steps 7-12 in Part 2 of this series.

Views, as you probably already know, are nothing more than SQL statements stored in a database container. They come in two varieties—local views and remote views. Local views are used to access VFP data only. Remote views can be used to access any ODBC-compliant data source. VFP's implementation of views allows you to work with them in much the same way you work with native tables. As I'll cover in detail in this article, making use of them is a key to being back-end-independent.

The other important concept is the use of wrappers. By wrapper, I'm referring to the technique whereby a function or command, like SKIP, is "wrapped" in some protective code and housed in a user-defined function so that each time it's used, you don't have to re-code the necessary error trapping. For instance, Function SafeSkip might look something like this:

Function SafeSkip
LParameters cAlias
llMoved = .F.
* Avoid that "EOF encountered" error
If !EOF(cAlias)
   Skip In (cAlias)
   If EOF(cAlias)
      * Moved to EOF, retreat
      Skip -1 In (cAlias)
   Else
      llMoved = .T.
   Endif
Endif
Return llMoved

I believe design pattern zealots call this a Decorator pattern—a term that means to "pretty up" some unfinished code. Whatever you want to call it, it's imperative to create "hooks" in your code and not to hard-code anything that might have to be dealt with in different ways depending on the back end. These techniques provide opportunities for you to write conditional code. The more data sources you intend to access, the more conditional code you might need. Writing code in this manner will provide you with the following benefits:

• It creates a single point of maintenance if a change needs to be made.

• The same code can access multiple data sources.

• You can begin coding a client/server application using a VFP back end until a decision has been made on using MS SQL Server, or Oracle, or Informix, or Sybase, or DB/2, and so on.

Step 1. View-based data entry forms
When it comes to data entry forms for a VFP client/server application, you can use either SQL Pass-through (SPT), which is simply passing a SQL string through ODBC to the data source, or remote views, meaning DBC-stored SQL statements whose resultant temporary tables act much like native VFP tables. SPT will store its result set in a local cursor, so you'd have to then loop through the controls on a form and bind each of them individually (for instance, thisform.txtCustomer.Value = SQLResult.customer). Of course, updating the server would require you to create the appropriate SQL Update, Insert, and Delete statements yourself, not to mention the multi-user contention checking that must be coded as well—a rather tedious, error-prone task. This is the way things are typically done in Visual Basic (so you know there must be a better way with VFP <g>).

Remote views, on the other hand, are incredibly simple to work with. All you need to do is create a view with the same structure as the underlying table, open the view (perform the SQL Select), manipulate the local temporary table it creates in the same way you would a native FoxPro table, then issue TableUpdate. TableUpdate automatically creates the SQL Update, SQL Insert, and SQL Delete statements for you and passes them to the ODBC driver manager on your workstation. The ODBC driver manager uses the ODBC driver you've chosen to translate the SQL into a syntax that the back end understands. It's that simple. I recommend using remote views for data entry forms for the following reasons:



• One code set. The same set of code can work against VFP tables or remote tables on a SQL Server. You'll just use local views or remote views accordingly.

• Performance. In many ways, views can be faster than SPT because TableUpdate (a low-level VFP function) automatically creates the SPT code for you. The time it would take to create the SQL update strings yourself and execute the code would most likely be longer than TableUpdate. If you think about how much TableUpdate actually does—scanning the change buffer, determining the type of update made, reading view properties, building the batch of SQL statements, passing them to the ODBC driver manager, returning a success/failure flag, and clearing the change buffer—I'm sure you'll agree that it can outperform SPT.

• All of the above can be done without a syntax error.

• Properties of views offer additional functionality over SPT. For example, to simulate a BatchUpdate count of five, you'd need to concatenate five SQL statements with semicolons before passing them to ODBC.

SPT does still have a very important place in a client/server application; I'll discuss it more in Step 3. For more information on the basic usage of local and remote views, refer to the VFP Developer's Guide.

Step 2. All data-related commands should funnel through one function
About the worst thing you can do to make converting your application to a client/server design difficult is to hard-code data access commands like SQL statements, Zap, Pack, Reindex, Seek, and so forth, within your code. In other words, does your code look like the following?

Function PurgeOrders
* This function purges (deletes) orders shipped
* prior to the date passed.
LParameter dShipDate
Local lcMsg
Select Count(*) as OrderCnt from Orders ;
   Where ShipDate <= dShipDate Into Cursor tcPurge
If _Tally = 0
   lcMsg = "There are no orders to purge."
Else
   Delete from Orders Where ShipDate <= dShipDate
   lcMsg = Trim(Str(_Tally)) + ;
      " orders were purged from the system."
Endif
MessageBox(lcMsg)
EndFunc

These kinds of SQL statements are only good to access tables that can be found along your path. In the client/server world, you communicate with a data source via a connection handle. For views, the Remote Connection <ConnectionName | DataSourceName> clause of the Create SQL View command enables a view to access remote data. And for in-line SQL, the FoxPro function SQLExec() is used to pass SQL statements through to the server via ODBC. It returns -1 for an error, 1 for success, and 0 if an asynchronous query has yet to complete. As an example, here's how to get a cursor of sales orders for a given customer using SQL Pass-through:

llSuccess = SQLExec(goEnv.nHandle, ;
   "Select * From ORDERS Where Customer = ?cCustomer", ;
   "tcOrders") > 0

If you'd employed a wrapper function for the SQL Select statements in PurgeOrders, then you could conditionally run it against data on a SQL Server, data on a file server, or even data on the local workstation. You should create classes of just such functions to make data access back-end-independent. Here's Function PurgeOrders rewritten in a client/server fashion:

Function PurgeOrders
* This function purges (deletes) orders shipped
* prior to the date passed.
LParameter dShipDate
Local lcSQLStr, lcMsg
lcSQLStr = "Select Count(*) as OrderCnt " + ;
   "from Orders " + ;
   "Where ShipDate <= ?dShipDate"
This.SQLExecute(lcSQLStr, "tcPurge")
If Reccount("tcPurge") = 0
   lcMsg = "There are no orders to purge."
Else
   lcSQLStr = "Delete from Orders " + ;
      "Where ShipDate <= ?dShipDate"
   This.SQLExecute(lcSQLStr)
   lcMsg = Trim(Str(Reccount("tcPurge"))) + ;
      " orders were purged from the system."
Endif
MessageBox(lcMsg)
EndFunc

Function SQLExecute
* Wrapper for VFP function SQLExec
LParameters cExecuteStr, cCursor
Local llSuccess
cCursor = Iif(PCount() = 1, "SQLResult", cCursor)
llSuccess = .T.
If goEnv.lRemote
   llSuccess = (SQLExec(goEnv.nHandle, ;
      cExecuteStr, cCursor) > 0)
Else && local, just macro expand cExecuteStr
   * Add VFP "Into Cursor..." clause
   If Upper(Left(cExecuteStr,6)) = "SELECT"
      cExecuteStr = cExecuteStr + ;
         " Into Cursor " + cCursor + " NoFilter"
   Endif
   * This should be error trapped to return llSuccess
   &cExecuteStr
Endif
Return llSuccess
EndFunc

You'll note the use of Reccount() vs. _Tally. SPT cursors don't update _TALLY, so you'll have to forget using this pretty cool system variable. However, you won't have to worry about Reccount() containing deleted records, since remote SQL data sources don't have a two-stage delete. And if the back end was a VFP database, the NoFilter clause would prevent VFP from creating a filter as opposed to a temporary table.

The second worst thing you can do to make converting your application to a client/server design



difficult is to embed VFP functions in your SQL Selects. However, if you have a wrapper like SQLExecute, you have an opportunity to parse out the SQL statement before it executes and convert it to the back-end-specific syntax. For instance, the SQL Server function Convert() is used to convert data types. So, you could probably imagine what the code would look like to find "Str(" or "Val(" in a string and StrTran() the Convert function in its place. One last option is to remove the embedded function and make the SQL statement back-end-independent, then use the VFP-specific function on the local cursor.

Step 3. Use stored procedures or parameterized queries
In terms of execution speed, there's nothing faster than a stored procedure (SP). Since they're precompiled and live in the same building as the tables, what could be more efficient? The way that parameters need to be passed to back ends is often different, so, again, a wrapper function comes in handy here. For example, notice the different syntax between VFP and SQL Server:

SQLExec("usp_GetOrders('Acme')")   && VFP
SQLExec("usp_GetOrders 'Acme'")    && SQL Server

All you'd need to do to solve this is enhance the SQLExecute wrapper function to handle stored procedures. (I'll leave that manageable task to you.)

If you don't want to go the stored procedure route, there's always parameterized queries stored in business objects. With parameterized queries, you wouldn't have to worry about back-end-specific stored procedure call syntax, nor would you have to worry about rewriting the stored procedures for each data source. The hit you take in performance might be worth the boost in maintainability.

You've already seen what a parameterized query looks like—it's just an SPT statement:

llSuccess = SQLExec(goEnv.nHandle, ;
   "Select * From Orders Where Customer = ?cCustomer", ;
   "tcOrders") > 0

Notice the syntax of the filter condition: Customer = ?cCustomer. That syntax is important because: 1) most back ends support it; and 2) it prevents you from having to build the parameters of the SQL statement into a back-end-specific character string:

Case lSQLServer
   "...Customer = '" + cCustomer + "' And " + ;
   "shipdate <= '" + DTOS(dShipdate) + "'"
Case VFP
   "...Customer = '" + cCustomer + "' And " + ;
   "shipdate <= {" + DTOS(dShipdate) + "}"

Another benefit is that you won't run into the dreaded "apostrophe bug." It occurs when you try to build a SQL statement like the preceding one, but the variable (in this case, cCustomer) has an apostrophe in it (for instance, "Jim's"). Since its value is evaluated before the string is sent to the server, you'll end up with mismatched quotes and an error at execution time. As you can see, the use of the "?" can really tighten your code, so I highly recommend its usage.

For the preceding reasons, I recommend SPT instead of views whenever you're creating an ad hoc query and/or don't intend to update the data source. Many people create views for reporting purposes, but for a large system, I can't see incurring the overhead of using a view when a stored procedure or SPT SELECT statement can be deployed instead. (Unless you wanted to expose a server-side view to your users.) The client-side DBC of views can get quite bloated with data it doesn't need to maintain, since there are no view properties that really need to be set.

Step 4. Find creative ways to limit result sets
In the client/server paradigm, record sets are transferred from the server to the local workstation; thus, they should be as small as possible in order to reduce network traffic. Therefore, unlike the way the typical VFP file-server application functions, forms should open without data. That will also help maintain a snappy form load, even as your data grows beyond the capacity of file-server databases, like VFP.

And if you think about it, it does make sense to give users only what they're looking for. Why present a user with a grid that has 10,000 open sales orders when he or she can only work with one customer at a time? I think we programmed like that for so long because incremental search utilities made for quick navigation of even large amounts of data. While that still might be acceptable for small systems, the load would become unbearable on a network, and your application wouldn't scale as data and users grow.

Parameterized views
So if your forms open without data in the client/server world, how do we go about getting data? You need to provide your users with a facility for entering filter criteria. These values will serve as parameters of the Where portion of the view's SQL statement. You can provide this functionality in many different ways—your choices range from the very simple to the very complex. A simple implementation is to have a form bound to a parameterized view. When a user enters Find mode, you then provide a text box for entering the parameter value. For example:

* Create this view when you create the form.
Create SQL View vOrders As Select * From Orders ;
   Where Customer = ?cCustomer
* Open the view (with no data) in the Data Environment



* or Load when you run the form.
Use vOrders NoData
* Get the customer parameter in Find Mode
cCustomer = thisform.txtCustomer.Value
Requery() && retrieve orders for cCustomer

A second technique that would provide much more flexibility is to provide a Query-By-Form (QBF) facility. QBF means that in Find Mode, you provide for the user a way to enter filter criteria into any (or just most) of the controls on a form. You then build a SQL Where string accordingly and use it in one of two ways:

1. Build an SPT statement to create a record set of the important fields and the unique ID. Create a view that's parameterized on a unique ID. Users can browse the SPT cursor to find the data they need, and you can requery the view based on the unique ID.

2. You can also use the QBF-generated SQL string to recreate the remote view definition on the fly. Since your DBC of views is on the workstation, there's no contention issue here. (More on this in Step 5.)

Tip! There are at least two third-party tools available for implementing Query-By-Form in VFP:

• QBF Builder (http://www.hallogram.com/qbfbuilder)
• Stonefield Query (http://www.stonefield.com)

Handling controls bound to look-up tables
This concept of limiting result sets should be applied to controls as well. What good is limiting sales orders to one customer if you're going to populate several combo boxes on your form with look-up table data? The several-thousand-record list box or combo box control has no place in a client/server application. There are at least two ways of getting around this problem. One is to not use these kinds of controls on large tables and instead use a text box. Then just launch a separate form with a grid upon the user choosing a hot key.

If you insist on using multi-record controls, you can try a second technique. Don't populate the control until the GotFocus event is fired. This late-binding technique will save you the overhead of populating unused controls. You can also use a combination of each of these techniques depending on the anticipated size of the look-up table, and just control it with a property and/or options table.

The VFP Developer's Guide offers yet another suggestion: copying often-used lookup tables to each workstation. However, knowing when to keep the local version in synch with the server version sounds like a nightmare to me. Therefore, I recommend one of the other options.

One-to-many forms
One-to-many forms should be handled as follows. Whatever technique you've decided to use for single-table forms should also be used for the header portion of a one-to-many form. The views of child tables should be parameterized on the unique ID of its parent. So, when you navigate from one header to another, the child view just needs to be requeried. Using this method, only the children of the current header need to be brought locally.

Step 5. Keep views in their own DBC to treat the back end independently
If, in your framework design, views are maintained in their own DBC, it will make the transition to other back ends more seamless. In other words, when you want your application to access a different data source, you want to keep as many pieces of your framework still intact. Keeping your views in a separate DBC from the tables (in a VFP back-end scenario) will make accessing a remote data source a matter of flipping a switch. The implementation could be as follows:

• Create a directory called \VFPViews, and store a DBC of local views in it. Name the DBC AppViews.DBC (\VFPViews\AppViews.DBC).

• Create a separate directory for each data source you wish to access, keeping the view's DBC name the same. In other words, the DBC \SSViews\AppViews.DBC would contain views that access a SQL Server. Since there are often several places in your code where you'll refer to the views database, this will prevent your having to write conditional code per back end.

• In an initialization file of your choice, store information like the ODBC DSN name, the directory path where the views are stored, a flag indicating whether you're accessing local or remote data, and, if local, the directory path of where the VFP tables are stored.

• Set Path To... the appropriate views directory on application startup.

• If it's a VFP back end, Set Path To... the network directory where the VFP tables are stored.

While it's true that you could add the name of the views database to the initialization file as well, often this is already hard-coded throughout an application. Having a separate DBC of views will also keep the size of it manageable. If the back end was a VFP database,



combining them would mean that twice the number of database objects would need to be stored. If such was the case, Modify Database could take forever on a large system. You'll also notice that the view's DBC (regardless of whether it contains local or remote views) is kept locally, not on a network to be shared by all users. Using this technique will allow for a lot of user-specific flexibility, as you'll learn later in the article.

Step 6. Use Locate rather than Seek for local data navigation
Since all data is accessed via SQL statements in a client/server application, you lose the ability to use every VFP developer's best friend—Seek. Wipe those tears, there's an alternative: Locate. Since local result sets will be (by client/server definition) relatively small, the Locate command will more than suffice for moving the record pointer in a local cursor or temporary table.

Here's even better news. The result set of a view is actually a temporary table. To see what I mean, open any view and check out the return value of ? DBF(). You should see something like C:\WINDOWS\TEMP\76633985.TMP. So this "thing" actually has a disk presence. And in VFP, if it has a disk presence, we can index it. So, feel free to index a view's cursor as much as you'd like if you feel you need the speed. But I'd still stick with Locate over Seek. Since Locate is also Rushmore-optimizable, using it with or without an index won't fail—one will just be marginally faster than the other.

Conclusion
Well, we're only halfway there, but already I'm sure you can see the writing on the wall. Wrappers and views are the keys to making your code back-end-independent. Keeping those concepts in mind at all times will give you the leverage you need to make a flexible, scalable client/server application. Next month, I'll cover six more steps and close with a sample form that can access a VFP database as well as a SQL Server database. Then, you're on your own. See you next month. Until then, start wrapping. ▲

Jim Falino has been happily developing in the Fox products since 1991, beginning with FoxBASE+. He's a Microsoft Certified Professional in Visual FoxPro and the vice president of the Professional Association of Database Developers (PADD) of the New York Metro Area. For the past three years, he's been a project manager, leading the development of a very large client/server apparel-manufacturing application for the GARPAC Corporation using Visual FoxPro as a front end to any ODBC-compliant SQL back end. jim@garpac.com.
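Step 3 leaves the stored-procedure enhancement of the SQLExecute wrapper as an exercise. One possible sketch follows; the method name SQLExecProc is hypothetical, and it assumes the goEnv object and the SQLExecute method shown in Step 2 (for the local case, a VFP procedure of the same name would have to be in scope):

Function SQLExecProc
* Sketch only: build the stored procedure call in the
* syntax the current back end expects, then delegate
* to the SQLExecute wrapper from Step 2.
LParameters cProcName, cParmList, cCursor
Local lcCallStr
If goEnv.lRemote
   * SQL Server syntax: usp_GetOrders 'Acme'
   lcCallStr = cProcName + " " + cParmList
Else
   * VFP syntax: usp_GetOrders('Acme')
   lcCallStr = cProcName + "(" + cParmList + ")"
Endif
Return This.SQLExecute(lcCallStr, cCursor)
EndFunc

A form could then call This.SQLExecProc("usp_GetOrders", "'Acme'", "tcOrders") and run unchanged against either back end.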



Driving the Data Bus

Third-Party Reporting Tools

Andrew Coates

Applies to VFP 6.0.

Getting the data into an application's tables is only half the battle. Some even say it's the easy half. In the second column of this series, Andrew discusses some of the third-party tools you can use to convert this data into valuable information.

LAST month, I reviewed VFP's native reporting tools—SQL, the Report Writer, and the cross-tab generator (see "Reporting—Converting Data into Information"). This month, I'll concentrate on some of the third-party reporting tools that are available and compatible with VFP. I'll discuss interfacing with Excel, getting your reporting output onto the Web (and a number of other output options), and putting your data onto a map.

Demonstration data set
To make this discussion more practical, I've developed a data set I'll use throughout this series. The database container and the associated tables are included in this month's Subscriber Downloads at www.pinpub.com/foxtalk. The data set represents a sales contact management system for a fictitious company, Maroubra Mapping (MM). There are tables for the company details, address details, and contact (employee) details, and a call table that records all interactions between MM and contacts. There are also lookup tables for some of the fields.

Why use third-party tools?
I spent much of last month's article telling you how wonderful FoxPro's reporting features are. Why should I now spend this month telling you how to use other tools, which are both less well-integrated with VFP and an additional expense? The simple answer is "horses for courses." There are tools available that will do some analysis and reporting tasks much better, faster, and more efficiently than the Fox ever will. This isn't a reflection on FoxPro; it's simply an acknowledgment of the fact that it's much more efficient to use the tools that were designed for a task to do that task.

Some examples of analysis that can be done better by other tools instead of FoxPro are shown in Table 1. There are tools that do some reporting tasks better than FoxPro, too. Examples are listed in Table 2. I'm going to cover analysis and reporting in separate sections, and I'll give some examples of each of the tasks in Tables 1 and 2.

Table 1. Examples of analysis tasks that can be done better by a third-party tool.

Task                                        Tool
Specialized financial analysis              Tool with financial functions (for example, Excel)
Geographic analysis                         Geographic Information System (for example, ArcView)
Computationally intensive numerical         Custom-written numerical model (for example,
modeling                                    Storm Water Management Model)

Table 2. Examples of reporting tasks that can be done better by a third-party tool.

Task                                        Tool
Multiple child reports                      Tool allowing sub-reports (for example, Crystal Reports)
Geographic reporting (mapping)              Geographic Information System (for example, ArcView)
Time-based reporting                        Tool capable of displaying animation (for example, PowerPoint)

Principles of using third-party tools
When using third-party tools from FoxPro, there are three things I try to do:

1. Minimize the inter-application interface;
2. Utilize the strengths of each tool; and
3. Modularize the tools used.

Minimize the interface
One of the least efficient aspects of any multi-tool application is the interaction between the tools. Wherever possible, keep the actual calling of one tool's functions from another to a minimum. Usually, the best way to do this is to write macros or programs in the tool you're controlling and just initiate those macros from the tool you're using to do the controlling.

Utilize strengths
This might sound obvious, but the reason you're using multiple tools is to get the best out of each of them. Do data handling in FoxPro—it's very good at it. Do numerical modeling in something else. Make sure

http://www.pinpub.com FoxTalk April 1999 9


you're using a hammer for the nails and a screwdriver
for the screws.

Modularize
Another great advantage of building a system with
third-party tools is that when a better solution comes
along, you can replace a component of your application and
your application will be better. This is easiest if you have
a central hub controlling each of the third-party tools
rather than chaining a whole lot of tools together in
sequence. I like to use FoxPro as this hub and call each
of the other tools from there.

Controlling third-party tools
Using the native FoxPro analysis and reporting functions
is relatively simple. Because they're built into FoxPro,
there are FoxPro commands and functions that call them
and control them. Using third-party tools poses more of
a problem, however. These tools weren't necessarily
developed with FoxPro in mind.

OLE
Fortunately, many of the modern Windows-based tools
(including VFP) use the OLE standard for inter-process
communication. This means that as long as you know
what the tool is registered as and what its object model is,
you can control the tool from within VFP.
I'm not going to discuss the mechanics of OLE
Automation any further in this column, but you can study
the sample files provided with VFP (from version 3 up)
for some examples of how to control Word and Excel.

DDE
Older Windows-based tools might not be OLE servers,
but they might be able to communicate through
Dynamic Data Exchange (DDE). An extension of this
technology is NetDDE, which allows communication
between applications across a network. There's an
example of calling a DDE server from FoxPro in the
"Mapping" section.

DOS run
Sometimes, there might be no way of directly controlling
your third-party tool from VFP. One analysis package I
use for storm water modeling is a DOS-based system that
requires its own data files and a configuration file. The
way I "control" that tool is to use VFP to extract the data I
need and then write the data and configuration files using
TEXT...ENDTEXT blocks. I can then run the model from a
DOS prompt and use the files I've generated. Of course, to
use the output from the model, I need to read it back into
VFP and send it to a reporting package.

Data analysis
While FoxPro does simple data analysis very well, there
are some tasks that are better delegated to a third-party
tool. Often these are computationally intensive tasks that
aren't a part of the native FoxPro language, such as
complex financial analysis and numerical modeling.
Another task for which FoxPro isn't natively equipped is
geographic analysis.

Financial analysis
A common computationally intensive task is financial
calculations. FoxPro offers very little in terms of native
financial functionality—the NPV, or Net Present Value,
clause of the CALCULATE command is the only one
that springs to mind. Excel, on the other hand, has more
than 50 financial functions. These range from ACCRINT
(returns the accrued interest for a security that pays
periodic interest) to YIELDMAT (returns the annual
yield of a security that pays interest at maturity). You
could write any of these functions in FoxPro, but the
computational performance would make this unworkable
for all but the most trivial applications.
Anyway, there's no need to rewrite any of these
functions in FoxPro if there's a copy of Excel on the target
machine. For example, Excel provides a function (IPMT)
that calculates the interest paid in a given period of a loan.
If you want to know how much interest you'll be paying
in the 45th month of a five-year, $145,000 loan at a
percentage rate of 6.6%pa with monthly payments, use
the following code:

oExcel = create('excel.application')
? oExcel.worksheetfunction.IPMT(0.066/12,45,60,145000)
* prints -238.94
release oExcel

In other words, $238.94 of that month's payment will be
interest. If you want to know what the entire payment
will be (capital and interest), use the Excel function PMT:

oExcel = create('excel.application')
? oExcel.worksheetfunction.PMT(0.066/12,60,145000)
* prints -2843.8882065830
release oExcel

In other words, each payment will be $2843.89.
Note the syntax you use here. First, create an instance
of the Excel OLE Automation server. Next, use Excel's
worksheetfunction property to call the PMT function.
You get the parameters for the various functions from the
Excel Help files. Finally, release the instance of Excel from
memory. Of course, if you're calling lots of Excel functions
from your application, just create the Excel instance when
you first need it and release it after you last need it.
Using this technique has the additional distinct
advantage that you can be fairly confident that the
calculations are correct (although, given the recent
spate of discoveries of recalculation bugs in Excel,
some people might dispute this <g>). There's no need
to rewrite functions that are already available and


debugged. Listing 1 shows a program that prints a fixed
interest loan payment and interest schedule using these
two Excel functions.

Listing 1. Using Excel to do financial analysis.

* Program....: PAYSCHED.PRG
* Version....: 1.0
* Author.....: Andrew Coates
* Date.......: January 30, 1999
* Notice.....: Copyright © 1999 Civil Solutions,
*              All Rights Reserved.
* Compiler...: Visual FoxPro 06.00.8167.00 for Windows
* Abstract...: Prints a payment schedule for a fixed
*              rate loan given an annual interest rate, a number
*              of years over which the loan is taken, a loan
*              amount, and a number of payments per year.
*              Uses the Excel functions PMT and IPMT.
* Changes....:

local loExcel, lnAPR, lnLoanYears, lnLoanAmount, ;
  lnPaymentsPerYear

* Set up values for calculations. In practice, you'd
* pass these as parameters or have them as properties
* of an object.
lnAPR = 7.5
lnLoanYears = 10
lnLoanAmount = 250000
lnPaymentsPerYear = 2

loExcel = create('excel.application')

* Calculate the periodic payment.
lnPayment = loExcel.worksheetfunction.pmt( ;
  (lnAPR/100)/lnPaymentsPerYear, ;
  lnPaymentsPerYear * lnLoanYears, ;
  lnLoanAmount)

* Round the periodic payment to the nearest cent.
lnPayment = round(lnPayment, 2)

* Clear the screen.
clear

* Set up the headings.
? 'Period', 'Payment', 'Interest'

* Loop through the periods and calculate the interest
* paid in each.
FOR lnPeriod = 1 to (lnPaymentsPerYear * lnLoanYears)
  lnInterest = loExcel.worksheetfunction.ipmt( ;
    (lnAPR/100)/lnPaymentsPerYear, ;
    lnPeriod, ;
    lnPaymentsPerYear * lnLoanYears, ;
    lnLoanAmount)

  * Round the interest paid to the nearest cent.
  lnInterest = round(lnInterest, 2)

  * Print the info for this period.
  ? transform(lnPeriod), transform(lnPayment, '@$'), ;
    transform(lnInterest, '@$')
ENDFOR

Numerical modeling
You might be involved in an area that requires numerical
modeling. For example, many engineering applications
use a technique where equations that would otherwise be
deemed insoluble are conquered by successively refined
approximation. Others estimate the state of a system over
a period of time by using the output from a set of
equations as the input to the same set of equations at
small time steps throughout the period. These recursive or
large sequential techniques are obviously very
computationally intensive and as such are best written
in a fast, highly optimized programming language.
However, these applications generally also require large
amounts of data to calibrate and also require verification.
We happen to have a system that handles and formats
large amounts of data fast and efficiently (this sounds like
a job for the FOX!).
Many numerical modeling tools are very task-specific,
and as such they don't have fancy interfaces
like OLE Automation—there just wouldn't be the return
to the developers for including these abilities. Many of
these tools have an ASCII data file input format, as well
as a configuration file from which all of the parameters
and control values are read. Preparation of these files is
often the most time-consuming and error-prone task in
using these tools, so any automation of the process is a
great boon.
Using FoxPro's text handling functions, you can
create perfectly formatted ASCII files from your data.
Listing 2 shows an example taken from a program that
prepares an input file for the U.S. EPA's Storm Water
Management Model (SWMM).

Listing 2. Preparing input files for a numerical model.

* Make sure we've got textmerge turned on.
lcTextMerge = set("textmerge")
set textmerge on

* Open the output file for writing (we'll use
* textmerge).
set textmerge to (lcDataPath + lcOutFile) noshow

* Check that we've opened it successfully.
if _TEXT < 0
  wait window "Could not open the output file for " + ;
    "writing" + chr(13) + PAKTC
  return
endif

* Write the program control details.

* Write the block header - $RUNOFF.

\*===============================================
\$RUNOFF
\*===============================================
\

* Write the title lines.

\*===============================================
\* Title Lines (A1)
\*===============================================
\
\A1 '<<alltrim(lcTitle1)>>'
\A1 '<<alltrim(lcTitle2)>>'
\

* Write the B lines (program control).

\*===============================================
\* Program Control Lines (B1-B4)
\*===============================================
\
\* METRIC ISNOW NRGAG INFILM KWALTY IVAP NHR NMN NDAY
\\ MONTH IYRSTR
\B1 <<lcMetric>> <<lcIsnow>> <<lcNrgag>> <<lcInfilm>>
\\ <<lcKwalty>> <<lcIvap>> <<lcNhr>> <<lcNmn>>
\\ <<lcNday>> <<lcMonth>> <<lcIyrstr>>
\
\* IPRN(1) IPRN(2) IPRN(3)
\B2 <<lcIprn1>> <<lcIprn2>> <<lcIprn3>>
\
\* WET WET/DRY DRY LUNIT LONG



\B3 <<lcWet>> <<lcWetDry>> <<lcDry>> <<lcLunit>>
\\ <<lcLong>>
\
* ... stuff omitted

select (lcRainGaugeTable)
index on (lcTimeField) tag (lcTimeField) addi
go top
ltStartTime = eval(lcTimeField)
scan
  lnTimeStep = (eval(lcTimeField) - ltStartTime) / ;
    iif(lcKtime = "0", 60, 1)
\E3 <<alltrim(str(lnTimeStep))>>
  for lnRainGaugeNumber = 1 to alen(laRainGaugeField)
\\ <<eval(laRainGaugeField[lnRainGaugeNumber])>>
  endfor
endscan

* remainder omitted

The FoxPro text output functions are fantastic.
They allow recursive embedding of variables, so you
could store all of your file templates in memo fields of
a table and use the <<>> evaluation symbols to expand
the contents of the memo field, which could, in turn,
include <<>> placeholders to allow customization of
the particular instance. I'd urge you to investigate this
often-overlooked feature if you ever need to produce
ASCII text files.

Geographic analysis
Increasingly, users are becoming more interested in the
geographic locations their data represents. Examples
include the suburbs in which customers live for
marketing, the rate of crime for a particular location for
rating an insurance risk, and many other applications.
You've probably got a postal code stored with each
address in your customer table, and you can use this, for
example, to determine which regions are generating the
highest sales at each store.
FoxPro is good at answering questions like "how
many sales were from one of the following list of postal
codes?" but isn't nearly as good at "how many sales were
made to clients living within 30 kilometers of any of our
17 stores?" There are tools that are good at this. They're
called Geographic Information Systems (GIS), and they're
database systems that are optimized for geographic data.
Some GIS even have an extended SQL that allows you to
answer queries such as the one posed previously with one
line of code. See the "Mapping" section on the next page
for an example of controlling a GIS from FoxPro.

Advanced reporting
There are some things that the FoxPro reporting system
just can't do as well as some of the third-party
reporting tools.

Crystal Reports
Arguably, one of the best third-party reporting tools is
Crystal Reports. Version 7 has just been released and is a
very impressive package. There's an OLE Automation
server and an ActiveX control, so you can ship the
runtime files with your application and your client
doesn't even need a copy of Crystal Reports. There's a
rich object model that allows programmatic access to a
wide range of reporting options.
Crystal Reports supports a much wider range of
output options than FoxPro does. The destinations
include (but aren't limited to):

• Exchange folders
• Lotus Notes servers
• ODBC data sources
• Word
• WordPerfect
• Excel
• 123
• E-mail

Crystal Reports also does reporting things that
you just can't do with the FoxPro report writer. My
favorite example of this is sub-reports. One common
application structure has the company as the parent
table and two child tables—the company's addresses
and the company's employees. In FoxPro, there's no
way to list the addresses and the employees of each
company in the same report. With Crystal Reports, there
is. You can include a sub-report in the detail band of the
main report that essentially runs a parameterized query
against the child tables based on the current value of the
parent table's primary key. This means that only the
appropriate child table records are printed for each parent
record, and as many child tables as you want can be
included in the report.
You can design reports using the report designer that
ships with Crystal Reports. Calling them from your
application is simple using OLE Automation. Listing 3
shows a VFP program that starts a Crystal Reports
instance, opens a report, refreshes the data in the report,
and prints it out, all from within VFP. Note that if you
want to run this program, you'll need to create an ODBC
DSN called DataBus that points to databus.dbc, included
in this month's Subscriber Downloads. You'll also need
the runtime for Crystal Reports 7.0.

Listing 3. A VFP program that controls a third-party reporting
tool (Crystal Reports) through OLE Automation.

* Program....: CRSIMPLE.PRG
* Version....: 1.0
* Author.....: Andrew Coates
* Date.......: January 27, 1999
* Notice.....: Copyright © 1999 Civil Solutions,
*              All Rights Reserved.
* Compiler...: Visual FoxPro 06.00.8167.00 for Windows
* Abstract...: Demonstrates running a Crystal Report
*              from within VFP using OLE Automation
* Changes....:



* for the MB_ constants
#INCLUDE foxpro.h

local loCRApp, loCRReport

* Start a Crystal Reports instance.
loCRApp = create('Crystal.CRPE.Application')

* Check that it started OK.
if type('loCRApp') = "O" and ! isnull(loCRApp)

  * Open a report.
  loCRReport = .null.
  loCRReport = loCRApp.OpenReport(sys(5) + sys(2003) ;
    + '\mm1.rpt')

else
  messagebox('Could not initialize Crystal Reports', ;
    MB_ICONSTOP, 'Error')
  return .f.
endif

* Check that the report was loaded OK.
if type('loCRReport') = "O" and ! isnull(loCRReport)

  * Refresh the data in the report.
  loCRReport.Database.verify()

  * Print it out.
  loCRReport.PrintOut(.f.)

else
  messagebox('Could not open Report', ;
    MB_ICONSTOP, 'Error')
endif

Mapping
FoxPro's native reporting capabilities don't extend to
mapping. If you need a picture of where things are, you
need to use a third-party tool. Using a GIS, you can lay
out and visualize the position of objects relative to other
objects. You can produce the picture that's worth a
thousand data points. ArcView is an industry-standard
desktop GIS. ArcView doesn't have a fancy OLE
Automation interface, but it can be controlled via DDE.
Listing 4 shows a program I used to produce a report
showing where a melanoma was on a patient and the
lymph node fields it drained to.

Listing 4. Controlling a desktop GIS from FoxPro.

* Program....: RUNARCVIEW.PRG
* Version....: 1.0
* Author.....: Andrew Coates
* Date.......: February 1, 1999
* Notice.....: Copyright © 1999 Civil Solutions,
*              All Rights Reserved.
* Compiler...: Visual FoxPro 06.00.8167.00 for Windows
* Abstract...: Calls an ArcView script via DDE to print
*              the information for this patient
*              translated from VBA
* Assumes....: patientdata table open and record pointer
*              on correct record
* Changes....:

#INCLUDE foxpro.h

* This module sends data to ArcView via DDE to get it
* to print a map. It passes the map number, randomized
* x and y values, the node fields comments (blank), and
* the patient name and sex.

* First, establish a connection to the DDE server.
nChan = DDEInitiate([ARCVIEW], [SYSTEM])

If nChan < 0
  MessageBox([Could Not Connect to ARCVIEW, ] + ;
    [Please ensure that it is running], ;
    MB_ICONEXCLAMATION)
  Return
EndIf

* Build the ArcView command.
cAVCOMMAND = [av.run('SinglePoint', {]
cAVCOMMAND = cAVCOMMAND + ;
  ltrim(Str(patientdata.MapNumber)) + [,] ;
  + ltrim(Str(patientdata.xValue)) + [,] ;
  + ltrim(Str(patientdata.yValue)) + [,] ;
  + [']+ ltrim(patientdata.nodes) + [','',] ;
  + ['] + ltrim(patientdata.patientname) + [, ] + ;
  upper(ltrim(patientdata.gender)) + ['})]

* Now, run the script.
cRetVal = DDEExecute(nChan, cAVCOMMAND)

* Check to see how the run went, and display an error
* if it failed.
If Left(cRetVal, 3) <> [Lay]
  msg = [Print Failed - Layout retained as ] + ;
    Trim(cRetVal) + ;
    [. Please print and delete it manually]

  answer = MessageBox(msg, MB_ICONEXCLAMATION, ;
    "Error Printing")
EndIf

* Terminate the DDE conversation with ArcView.
DDETerminate(nChan)

Notice that this program only actually communicates
with ArcView once (apart from initiating and terminating
the DDE conversation). FoxPro does the data handling,
and then the ArcView script SinglePoint does the
reporting work.

Putting it all together
Using FoxPro as the central application, it's possible to do
some pretty complex analysis and reporting. One project I
worked on was the modeling of an urban storm water
catchment to predict the effect of different development
options on the catchment response. The project was
broken up into two parts, analysis and reporting. The
basic layout of the components is shown in Figure 1.
In the analysis section, non-technical users alter the
land use of

Figure 1. Component layout for an analysis and reporting
system using third-party tools.

Continues on page 19
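A closing cross-check on the Excel figures used in this article: the periodic payment that Excel's PMT function returns is simply the standard closed-form annuity result. For the $145,000 loan quoted in the "Financial analysis" section (monthly rate r = 0.066/12 = 0.0055, n = 60 payments), the numbers work out as follows (a sketch of the underlying math, not part of the original listings):

```latex
% Payment on a loan of principal P at periodic rate r over n payments:
\mathrm{PMT} \;=\; \frac{P\,r}{1-(1+r)^{-n}}
         \;=\; \frac{145000 \times 0.0055}{1-(1.0055)^{-60}}
         \;\approx\; 2843.89
```

Excel reports the value as negative (-2843.8882...) because it treats the payment as a cash outflow.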




Solving a Mystery
Michael Levy 6.0

"Curiosity killed the cat" is how the saying goes. In this article,
Michael continues his series on Visual FoxPro and SQL Server
by demonstrating some of the tools that we can use to spy on the
communications between VFP and SQL Server 7.0. Let's hope
that Michael doesn't come to the same end as the cat.

I think that a lot of people are like me and are curious
about how things work—not all things, though, just
computer things. I feel that if I have an understanding
of how it works, I have an advantage over the machine.
Also, as an instructor, the "how does it work" questions
are very common.

The mystery
The mystery that I decided to tackle is the NODATA
option of the USE statement. How does VFP implement
the NODATA clause? For those of you who might not be
familiar with the NODATA option, it instructs VFP to
open a view (either local or remote) but not to fill the
cursor with any data.
I have no way to peer into the inner workings of
VFP. But I do have tools that will allow me to monitor
the communications between VFP and SQL Server.
Somewhere in those communications lies the answer.
Somewhere in those communications, VFP is telling SQL
Server to run a query (or at least evaluate it) and return
the structure of the result set.

The SQL Server Profiler
The SQL Server Profiler is an external tool that ships with
SQL Server 7.0 and is used to monitor and record activity
(called a "Trace") that occurs on your SQL Server. Events,
such as the submission of ad hoc queries and the
execution of stored procedures, are displayed on the
screen and/or optionally written to a file or a SQL Server
table. A saved Trace can be replayed at a later point in
time or used by the Index Tuning Wizard. The Index
Tuning Wizard will evaluate the queries submitted to the
server against the existing indexes and recommend
changes to the indexes to improve query performance.
We'll use the Profiler to capture the commands that
VFP sends to SQL Server when it wants to open a remote
view using the NODATA clause.

Creating a Trace
Before you can make use of the Profiler, you have to create
a Trace. A Trace tells the Profiler which events to look for,
what event data to capture, and any filters that should be
applied to limit the data that's collected.
There are two ways to create a Trace. The easiest way
is to use the Create Trace Wizard. The wizard takes the
approach of solving some common problems like finding
the worst performing queries or identifying the cause of a
deadlock. The other way to create a Trace is to use the
Trace Properties dialog box. The Trace Properties dialog
box can be opened by clicking on the toolbar button or
selecting File | New | Trace. The Trace Properties dialog
box contains four tabs: General, Events, Data Columns,
and Filters.
The General tab is used to capture information like
the name of the Trace, its type, the server to monitor, and
whether to record the captured information to a file and/
or table. A Trace can be one of two types: Private or
Shared. A Shared Trace is available to anyone using the
computer where the Trace was originally defined and is
stored in the Registry under the key HKEY_LOCAL_
MACHINE\SOFTWARE\Microsoft\MSSQLServer\
SQLServerProfiler\Client. A Private Trace is only
available to the person who created it. A Private Trace is
also stored in the Registry under the key HKEY_
CURRENT_USER\SOFTWARE\Microsoft\MSSQLServer\
SQLServerProfiler\Client.
The Events page is where you specify the events that
you wish to monitor and record. An event is some action
that's occurred on the server, such as a connection being
made or broken, a stored procedure being executed, or a
batch of SQL statements being executed.
The Data Columns page has two uses. The first is to
specify what information to capture about each event.
The type of information will depend on the events that
you're monitoring. For instance, if you're monitoring
SQL:StmtCompleted, the Text Data column will contain
the text of the statement that was executed, and the
Integer Data column will contain the number of rows
returned by the statement. The second use is to specify a
grouping for the captured information. You might want
to group all the events by connection. Without a grouping,
all events will be listed in the order that the Profiler
captured them.
The final page is the Filters page. Filters limit the data
collected by the Profiler. There are three types of filters.
Value filters allow you to specify only one value to
include in the captured information. For instance, you can
specify the Connection ID to include. Range filters allow



you to specify a range that a value must fall within in
order to be included in the captured information. An
example would be the CPU filter, which allows you to
specify the minimum and/or maximum CPU usage
in milliseconds.
Include/Exclude filters allow you to specify a
discrete list of values that the captured data must have, or
must not have, to be included in the Trace. Application
Name is an example of this type of filter. By default, the
Application Name filter will be configured to exclude
applications that have registered a name beginning with
"SQL Server Profiler." I like to add SQLAgent to the
list of excluded applications by changing the exclude
option to "SQL Server Profiler%; SQLAgent%".

Running a Trace
Once the Trace is defined, you can start it by selecting the
Start Traces toolbar button and selecting your Trace from
the list. After the Trace has been started, you'll see a
window with two panes. The top pane contains a list of
all the recorded events; the bottom pane contains a copy
of the data returned in the Text Data column.

Configuring VFP
On the VFP side, we need very little—just a database
container, a connection, and a remote view. You can use
the following VFP commands to create everything:

*-- The database container
CREATE DATABASE Mystery
SET DATABASE TO Mystery

*-- The connection
CREATE CONNECTION cnMystery ;
  CONNSTRING "Driver=SQL Server;Server=merlin;" + ;
  "Database=pubs;Trusted_Connection=Yes"

*-- The remote view
CREATE SQL VIEW v_authors REMOTE CONNECTION cnMystery ;
  AS ;
  SELECT * FROM authors

The connection that I've created here is called a
"DSN-less" connection. I've supplied all the parameters
that ODBC requires to establish the connection, without
using the ODBC Administrator to create a DSN.

Configuring SQL Server Profiler
Setting up the Profiler requires that we configure and start
a Trace. Create a new Trace by selecting the New Trace
toolbar button. The Trace Properties dialog box will
appear. Create a Private Trace named Mystery. On the
Events page, select the following events: SP:Completed,
RPC:Completed, and SQL:StmtCompleted (see Figure 1).
On the Data Columns page, keep the columns that were
selected by default. On the Filters page, add 'Microsoft%'
to the Include option for the Application Name filter (see
Figure 2).

Running the experiment
Start the Trace if it didn't start automatically. Switch to
VFP, and execute these commands:

OPEN DATABASE Mystery
USE v_authors NODATA
USE

Figure 1. The Events page of the Trace Properties dialog box.
Figure 2. The Filters page of the Trace Properties dialog box.
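Incidentally, you can ask SQL Server for this same kind of "structure only" answer yourself through VFP's SQL pass-through functions. The sketch below is not from the article—it reuses the merlin/pubs connection string shown above and relies on SQL Server's SET FMTONLY switch, which tells the server to describe the result set without returning any rows:

```foxpro
* A sketch (assumes the merlin server and pubs database
* from this article are reachable). With FMTONLY ON,
* SQL Server returns column metadata but no rows.
lnHandle = SQLSTRINGCONNECT("Driver=SQL Server;Server=merlin;" + ;
  "Database=pubs;Trusted_Connection=Yes")
IF lnHandle > 0
  SQLEXEC(lnHandle, "SET FMTONLY ON")
  SQLEXEC(lnHandle, "SELECT * FROM authors", "c_structure")
  SQLEXEC(lnHandle, "SET FMTONLY OFF")
  SQLDISCONNECT(lnHandle)
  * c_structure now holds the authors columns and zero rows.
ENDIF
```

This isn't what VFP does internally for NODATA—as the Trace will show, it goes through sp_prepare—but the effect on the result set is the same.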



You'll only need to open the database if it's not
already open. Then switch back to the Profiler and stop
the Trace.

The results
Figure 3 shows the results from my computer. I've
expanded the event of interest; it's the first
RPC:Completed event. The Text Data column shows
the batch that was sent to SQL Server. It's a call to the
sp_prepare system stored procedure.

Figure 3. The results of a SQL Server Profiler Trace.

If you were to look up sp_prepare in the SQL Server 7.0
Books Online (BOL), you'd find it mentioned in two places.
The first is with a discussion of NT Performance Monitor
objects and counters. The second is with the system stored
procedures—at the bottom. As it turns out, sp_prepare isn't
well-documented. It's meant to be called from OLEDB
Providers and ODBC drivers only.
I'm still not satisfied. I still don't understand how
VFP is getting the information that it needs to construct
the result set. There's one more tool that might help with
the missing pieces: Dynamic ODBC Tracing.

Dynamic ODBC Tracing
The ODBC Manager has the ability to record all ODBC
function calls. All calls made between the application and
the ODBC Manager or between the ODBC Manager and
the ODBC Driver are captured and recorded in a log file.
To enable ODBC Tracing, open the ODBC
Administrator. The ODBC Administrator must remain
open while the Tracing occurs. On the Tracing tab, select
the name and location for the log file. After you've
selected a file to capture the ODBC calls, click the Start
Tracing Now button. Any ODBC function call made by an
application or the ODBC Manager will now be logged
into the file you previously specified.
To disable ODBC Tracing, click the Stop Tracing Now
button on the Tracing tab of the ODBC Administrator.
Once the Tracing has been disabled, you're free to close
the ODBC Administrator.

Experiment number two
This time I re-ran the same VFP commands as before, but I
started the Dynamic ODBC Tracing instead of the SQL
Server Profiler (see Figure 4).

Finally, the answer
Figure 4 contains the call that VFP makes to prepare the
SQL statement. I looked up SQLPrepare() in the ODBC
Programmer's Reference and found the following line:
"Once the application prepares a statement, it can request
information about the format of the result set." This is
what VFP does. After the call to SQLPrepare(), VFP
queries ODBC for the number of columns that exist in the
result set (see Figure 5). Then VFP makes a series of calls
to SQLColAttributes() to gather information about the
column like the datatype, size, precision and scale (if
applicable), null support, and name (see Figure 6).

Conclusion
The goal of this article was twofold. First, I wanted to
demonstrate two of my favorite tools that most
developers either don't understand or don't even know
about: ODBC Tracing and the SQL Server Profiler. I've
found both to be very helpful in determining the cause
of problems when things look right in VFP but I don't
get the correct results from SQL Server. Second, I really
was curious how VFP was retrieving the necessary
information from SQL Server to create an empty view.
You can use these same tools to help troubleshoot
problems you run into when connecting VFP and
SQL Server. ▲

Michael Levy is a consultant with ISResearch, Inc., a Microsoft Solution
Provider and Certified Technical Education Center. He's also a Microsoft
Certified Solution Developer and a Microsoft Certified Trainer. Michael
specializes in using Visual Studio and SQL Server to solve business
problems. mlevy@isresearch.com.

Figure 4. It's obvious that the output from ODBC Tracing was
meant for C/C++ programmers. This is the section of the Trace file
where VFP is calling SQLPrepare().
Figure 5. VFP queries ODBC for the number of columns in the
result set. SQLNumResultCols() returns 10.
Figure 6. An example of VFP querying ODBC for
column attributes.
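As an aside (an assumption of mine, not from the article): the choices made on the Tracing tab are persisted in the registry under HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\ODBC, which in .INI form looks like the fragment below. The log file path here is only a placeholder, and the exact storage location can vary with the ODBC version installed:

```ini
[ODBC]
Trace=1
TraceFile=C:\Temp\odbctrace.log
```

Checking these values can be a quick way to confirm whether tracing was accidentally left on, since an active trace log slows every ODBC call on the machine.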



What's Really Inside

The Basics of Buffering


Jim Booth 6.0

In FoxPro 2.x, we spent a lot of effort to preserve record values
so we could offer the users an opportunity to revert their
edits. VFP gives us a mechanism to make this issue easier to
handle. Data buffering is the tool to use for this. In this article
and the next few in this series, Jim examines the nuances of
data buffering and the other ancillary requirements in VFP.
This month, he covers the basics of data buffering.

THE first question is, "Should I use data buffering at
all, or should I just keep on going with SCATTER
and GATHER to handle the problem?" Well, the
answer isn't simple. SCATTER and GATHER require that
the variables or array be scoped properly so that they're
visible to all of the objects that need to access them. Since
variables are, by default, privately scoped (that is, they're
destroyed when the routine that created them terminates),
we have a problem.
If, in a form method, I SCATTER MEMVAR, those
variables go out of scope as soon as the form's method
ends, unless I declare all of the variables as public before
doing the SCATTER. Declaring variables public has its
own set of problems that are beyond the scope of this
article, so let's just agree that we don't want to do that
unless there's absolutely no other way to do what we
want to do.
With the ability of VFP's controls to be bound to data,
it makes sense to have the data available during the
creation of the control. The creation sequence is such that
the Data Environment (DE) of a form is created before any
of the controls are created. This allows the DE to open the
data cursors before the controls try to bind to their
controlsources. If we bind the controls directly to the
fields in the cursors, this all works well; however, if we're
binding to memory variables, we must ensure that those
variables exist at the time the controls are created.
This variable creation could be done in the form's
Load event (which fires before the controls are created) by
declaring the variables as public and then scattering to
them. Oops, there goes that public stuff again (where's a
vampire slayer when you need one?).
Data buffering gives us the best of both worlds—
we can bind directly to fields, and we can control
the updating or reverting of the record on disk. By
binding directly to fields, the whole variable scoping
problem disappears.
Is there ever a situation where you might not want
buffering? Yes. If one of the requirements is that edits are
committed immediately and irrevocably, then "no
buffering" is the way to go. Additionally, if you have
a table that's read-only, then you can set buffering
to "None."

What is data buffering?
In FoxPro 2.x, when we issued our GETs against fields in
a table, we said we were doing direct edits. We called
them direct because we assumed that the fields in the
table were being directly edited. We had no control over
the updating process. In fact, FoxPro 2.x was doing the
edit in a memory buffer of the record, which would later
be used to update the table. In FoxPro 2.x, we had no
control over the buffer—it would be written when FoxPro
got around to it, and there was no way we could stop it
from happening.
VFP's data buffering mechanism gives us control
over that edit buffer. It allows us, when buffering is on,
to control when and if the buffer gets committed to disk.
Data buffering is simply a technology that makes it easier
for us to do what we were doing before.
VFP's data buffering adds some functionality to
the process. With the use of functions like OldVal() and
CurVal(), we can find out what the buffer started with
for values and find out what's currently on disk,
respectively. There are also a number of other functions
that enhance the data buffering in VFP as we go along
on this exploration.

Why five buffering modes?
There are five buffer modes. These are:

• None (1)
• Pessimistic Row (2)
• Optimistic Row (3)
• Pessimistic Table (4)
• Optimistic Table (5)

None
The None (1) setting disables the data buffering and
causes VFP to act just like FoxPro 2.x in regards to
editing data.

Optimistic
With optimistic buffering, VFP doesn't lock any records
when editing begins. Instead, when an attempt to update
the table occurs, VFP checks to see whether the record it's
http://www.pinpub.com FoxTalk April 1999 17


updating is the same as what the buffer started with. If they're the same, the update is done; if not, then the update isn't done (more on this later).

Pessimistic
With pessimistic buffering, VFP first attempts to lock the record when an edit begins. Then, if it can get the lock, the edit is allowed. If VFP can't get the lock, then an error condition ("Record is in use by another") occurs, and the edit is rejected.

Row
Row buffering allows only one record to be dirty in the buffer at a time. Dirty, as used here, means that the record has been edited in the buffer. An attempt to move the record pointer in a cursor that has row buffering on will cause VFP to implicitly attempt to update the table.

Table
Table buffering allows multiple records to be dirty in a cursor's buffer. Moving the record pointer doesn't have any implicit updating activity.

By combining the optimistic/pessimistic with the Row/Table, you get four separate buffer modes in addition to the None setting.

What's that sixth buffering mode?
In the BufferModeOverride property for a cursor in a DE, you'll see six choices. The additional choice is "Use the form setting," which causes VFP to use the BufferMode setting for the form and to decide on Table or Row buffering based on the type of control being used (more on this later).

Which one has the right stuff?
First, let's look at optimistic vs. pessimistic. Pessimistic locks records on the server; it also ensures the right to write before it allows the edit to begin. These seem like some very valid reasons to prefer pessimistic buffering. Well, before we draw any conclusions, let's look at the downside of pessimistic buffering.

Pessimistic buffering locks the record when the edit begins and holds that lock until the update occurs. This means that if Mary starts an edit and then goes to lunch, no one can work on the record that Mary is editing. Pessimistic uses locks at the network server, which consume server resources. Often the network server has a limited number of simultaneous locks available, and if your number of users grows, you could end up with network errors when the pessimistic buffering tries to get a lock and can't.

Optimistic buffering doesn't secure locks until the update occurs. It gets the lock, does the update, and releases the lock. These locks are held briefly. The downside is that optimistic buffering might fail to update because of an update conflict, while pessimistic buffering won't. Isn't that enough reason to prefer pessimistic? No. Using optimistic buffering, you can secure your own lock when an edit begins and release the lock when you want to, so optimistic buffering can provide the same functionality as pessimistic. The problem as I see it is that pessimistic buffering gives no choice or control over what to lock and when (it happens by magic), while optimistic gives me that control. I hate magic! So, my preference is optimistic buffering for everything. If I need to ensure the writing of data, I'll handle that in my code.

How about Row vs. Table? The same reasoning applied to optimistic vs. pessimistic can be used here. Row buffering allows only one record to be dirty at a time, and it does a magic update if the record pointer moves. Table buffering does no magic stuff at all; because it allows multiple records to be dirty at once, it doesn't need to update when the pointer moves. You can restrict the editing to only one record at a time through the user interface; don't let them move to another record while an edit is in progress.

Why is the implicit update a problem? VFP is object-oriented, and as such we create classes that have specific behavior. We try to make these classes generic so they can be used in multiple places. If you create a form, use Row buffering, and then at a later time add another control to the form based on a class that looks something up in a cursor, you might very well introduce an unexpected automatic update of the buffer. Not only can this situation be created, it would be a bear to try to debug. With Table buffering, there's no magic going on; therefore, the issue here becomes a non-issue.

So, then, my opinion is to use optimistic Table buffering all the time. If I need single-record editing, I code it. If I need ensured writes (pessimistic locks), I code for it.

How and where do I set the buffer mode?
You have a variety of ways to set buffer modes. They're the form's BufferMode property, the cursor's BufferModeOverride property, and the CursorSetProp() function.

Form's BufferMode property
The BufferMode property of a form gives you three choices for setting the buffer mode: 0-None, 1-Pessimistic, and 2-Optimistic, respectively. The row or table option of the buffer mode is handled by the control that the table is bound to; if it's bound to a grid, Table buffering is used, and if it's bound to another control, Row buffering is used.

Cursor's BufferModeOverride property
The BufferModeOverride property for a cursor in a DE allows six settings. The options are 0-None, 1-Use form



setting, 2-Pessimistic Row, 3-Optimistic Row, 4-Pessimistic Table, and 5-Optimistic Table. The BufferModeOverride should be set for every cursor in the DE if you want a setting other than 1-Use form setting (the default).

CursorSetProp()
The CursorSetProp() function can be used to programmatically set the buffer mode. The CursorSetProp() function has the following syntax:

CURSORSETPROP(cProperty [, eExpression] ;
  [, cTableAlias | nWorkArea])

To set buffering modes, the first argument is "BUFFERING", the second is the number of the buffer mode desired (as listed in the BufferModeOverride section), and the last is the alias name of the cursor in which you want to set the buffer mode. For example, if I wanted to set optimistic Table buffering in a cursor with the alias of Customer, I'd do the following:

CursorSetProp("buffering", 5, "Customer")

In order to use CursorSetProp() to set a buffering mode, the SET MULTILOCKS option must be ON (it's off by default unless you've changed this in the Tools|Options dialog box). Setting buffer modes in a form doesn't require explicit setting of the multilocks option, as the form will automatically set the multilocks for you.

How do you control the updates?
The updating process is controlled through two functions, TableUpdate() and TableRevert(). The former does an update of the record, and the latter undoes the edits and reverts the buffer back to what's currently in the record. Next month, we'll investigate these two functions in more depth.

Conclusion
"We can do this the hard way or we can do this the easy way." Data buffering is the easy way, once you understand how it works. It takes less code to accomplish the same results as the FoxPro 2.x methodology, and it allows for the exploitation of the data binding capabilities of the VFP controls. There are a lot of choices related to data buffering. A careful examination of each of these choices will go a long way in simplifying the decisions. ▲

Jim Booth is a Visual FoxPro developer and trainer. He has spoken at FoxPro conferences in North America and Europe. Jim has been a recipient of the Microsoft Most Valuable Professional Award every year since it was first presented in 1993. He is coauthor of Effective Techniques for Application Development and Visual FoxPro 3 Unleashed and is contributing author for Special Edition Using Visual FoxPro 6.0. Visit his Web site at www.jamesbooth.com. 203-758-6942, jbooth@jamesbooth.com.
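Pulling the article's recommendation together—optimistic Table buffering set with CursorSetProp(), plus a hand-coded lock for the occasional ensured write—a minimal sketch might look like this (the Customer alias and contact field are invented for illustration):

*** Optimistic Table buffering, with a hand-rolled
*** "pessimistic" lock (alias and field names are
*** invented for illustration)
SET MULTILOCKS ON
USE Customer
CursorSetProp("Buffering", 5, "Customer")

*** When the user starts an edit, secure the lock
*** ourselves instead of letting VFP do it by magic
IF RLOCK("Customer")
   REPLACE Customer.contact WITH "New Name"
   IF TABLEUPDATE(.T., .F., "Customer")
      *** Changes committed to disk
   ELSE
      *** Update failed -- throw the buffered edits away
      = TABLEREVERT(.T., "Customer")
   ENDIF
   UNLOCK IN Customer
ELSE
   *** Someone else holds the lock; don't allow the edit
ENDIF

The RLOCK()/UNLOCK pair gives the pessimistic guarantee only to the edits that need it, while everything else stays optimistic.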

Third-Party Reporting . . .
Continued from page 13

different areas in the catchment (for example, build a factory on undeveloped parkland) using a geographic information system (GIS)—ArcView. FoxPro then reads the geographic data as well as some rainfall data and produces input files for a numerical model—SWMM. The numerical model predicts how full each of the segments of pipe in the drainage network will run at each time step and outputs its results to a text file.

In the reporting phase, FoxPro reads the text file and sends the data to the GIS. The GIS produces a standard .WMF file for each time step in which the pipe segments are colored and proportioned relative to the amount of water flowing through them at that time. Finally, a PowerPoint presentation is built from FoxPro using OLE Automation so that each of the time steps can be displayed in sequence. This gives the effect of viewing the state of the network as the design storm progresses. Displaying the data in this way allows the users to quickly see the effects of their design decisions. A sample of the PowerPoint presentation produced is included in the Subscriber Downloads.

Conclusion
FoxPro has a rich and powerful set of analysis and reporting features, but it isn't all things to all people. The smartest way to develop solutions is to use the appropriate tool for the task. If your application requires fast data access and a good GUI development environment, then by all means use FoxPro exclusively. If there are more specialized requirements, then make use of the specialized tools available and you'll save yourself a whole heap of heartache.

This is the last installment in the analysis and reporting series of this column. Next month, I'm going to discuss some techniques for matching records, including phonetic matching, address standardization, and other techniques. ▲

04COATSC.ZIP at www.pinpub.com/foxtalk

Andrew Coates is a director of Civil Solutions, a PC development consultancy in the Olympic City, Sydney, Australia. Andrew specializes in PC database applications, particularly in integrating components and visualizing spatial data. a.coates@civilsolutions.com.au.
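The OLE Automation step described above might be sketched as follows; the file names, slide count, and layout constant are invented for illustration, and the exact object model details depend on the PowerPoint version installed:

*** Build a presentation from a set of .WMF frames via
*** OLE Automation (paths and names are invented; the
*** object model varies with the PowerPoint version)
LOCAL loPPT, loPres, loSlide, lnStep
loPPT = CREATEOBJECT("PowerPoint.Application")
loPres = loPPT.Presentations.Add()
FOR lnStep = 1 TO 10
   *** One slide per time step, holding that step's picture
   loSlide = loPres.Slides.Add(lnStep, 12)  && 12 = blank layout
   loSlide.Shapes.AddPicture("C:\STORM\STEP" + ;
      LTRIM(STR(lnStep)) + ".WMF", .F., .T., 50, 50)
ENDFOR
loPres.SaveAs("C:\STORM\STORM.PPT")
loPPT.Quit()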



The Kit Box

Buffering: The Transaction Slayer

"A Discussion of Transactions and Buffering Modes in the Context of the Construction of a Multi-layer Commission Calculation Class"

Paul Maskens and Andy Kramek

This month, Paul and Andy look at table buffering and transactions, based on the commission update problem from last month (see "Curses and Recurses"). If you're not up to speed with the basics behind buffering, see Jim Booth's article on buffering on page 17.

Paul: Last month, we created a class to calculate the commissions. Having looked at some of the issues in saving that data, I think that was the easy part. Now I know why you tried to run away. The solution seems slightly easier in VFP 5 than in VFP 3, but should we talk about both?

Andy: I guess we should; we can't afford to assume that every reader has gone to VFP 5 or VFP 6. As far as this problem is concerned, we're only going to use the basic TABLEUPDATE(), which is common to versions 3, 5, and 6. Essentially, all we need to talk about is the difference in the behavior of TABLEUPDATE(); the additional functionality is worth having, but we don't need it in this case.

Paul: Well, here's what I think happens. First, you have to update some records <g>. Then, when you issue TABLEUPDATE( .T., .F., 'mytable' ) in VFP 3, the updates are attempted. But FoxPro stops updating the table from the buffer as soon as the first error occurs. If you use VFP version 5, then you can use TABLEUPDATE( 2, .F., 'mytable', laErrors ), and FoxPro will again attempt to update all buffered rows, continuing after any errors and placing the record number of each record that can't be updated into the array passed as the fourth parameter.

Andy: While that's useful, we don't need to use it because, in this case, any failure means we have to recalculate the entire commission chain. So we just need to roll back, recalculate, and try again. But why are you assuming table buffering?

Paul: I just assumed that we were going to use optimistic table buffering because I couldn't see how to make any other buffering system work in this case. After all, we're going to be amending multiple records when we walk the salesman hierarchy calculating those commissions.

Andy: I don't agree. Remember, the transaction is holding the locks, so we can use an explicit TABLEUPDATE() after each set of values from the array is copied to the record. It doesn't matter whether the buffering is table or row buffering, because we're always updating the row as we get to it.

Paul: Oh, I see. That makes it simpler. Before we get started, I'd like to clear up a misunderstanding over TABLEUPDATE() that I've seen in live code. It doesn't help when the VFP Help file uses =TABLEUPDATE(.T.) in its example. The return value isn't tested, and it's assumed that the update will succeed. Updating the name field in one record is hardly a real-world example, let alone good programming practice—that function return result is provided for a reason! When a TABLEUPDATE() returns .F., that means the update didn't complete successfully. Code using TABLEUPDATE() has to check that result or risk leaving unsaved buffered changes. In VFP 3, those changes are detected when the table is closed, forcing an error. In VFP 5, those changes are simply lost.

Andy: That's the correct behavior. No other sensible database management system would consider closing a table with uncommitted changes to be an error; it would simply assume the changes aren't required.
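Paul's complaint about the Help file example translates into a simple habit: never discard the TABLEUPDATE() return value. A minimal sketch (the alias mytable is invented for illustration):

*** Always test the result of TABLEUPDATE()
*** (the alias 'mytable' is just an example)
IF TABLEUPDATE(.T., .F., 'mytable')
   *** All buffered changes made it to disk
ELSE
   *** The update failed -- deal with it now, or the
   *** buffered changes may be silently lost later
   = TABLEREVERT(.T., 'mytable')
   = MESSAGEBOX("Update failed -- changes reverted")
ENDIF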



Paul: Okay, now, with that out of the way, I agree that we need a transaction so that all the changes can be undone. Am I right in assuming it's because each TABLEUPDATE() will save some data from the buffer, leaving that record marked as not updated any more? What I need is to be able to revert the changes after they've been saved—which TABLEREVERT() won't do.

Andy: Correct, you can't use TABLEREVERT() at that point because you've done a TABLEUPDATE() that's succeeded. Provided they all succeed, there's no problem.

Paul: Well, there's a little problem with transactions, isn't there? They can only be nested to five levels! Just where are those transactions going to be started and ended when saving the data? If the data needs to be updated more than five levels up the hierarchy, won't we run out of transaction levels?

Andy: It's a reasonable question, but that's not how it works. Look at the code <g>. I've only used one transaction around the whole hierarchical update, because we're interested in an all-or-nothing update.

Paul: So the BEGIN TRANSACTION in the UpdResult() method is the only one that's needed? That seems simple. Here's the relevant code from last month:

*** Method to update the result table
*** Returns .T./.F.
PROTECTED FUNCTION UpdResult()
  LOCAL llOk, lnRows, lnCnt
  PRIVATE pnResPk, pnOriCnt, pnOriVal, pnNewCnt, ;
    pnNewVal
  *** Start a transaction
  llOk = .T.
  BEGIN TRANSACTION
  lnRows = ALEN( This.aStack, 1 )
  FOR lnCnt = 1 TO lnRows
    STORE 0 TO pnResPk, pnOriCnt, pnOriVal, ;
      pnNewCnt, pnNewVal
    This.Pop( lnCnt )
    *** Find results record
    llOK = This.FindRec( pnResPk, 'salesres', ;
      'respk' )
    IF ! llOK
      EXIT
    ENDIF
    *** Check that values are unchanged
    IF salesres.rescount = pnOriCnt ;
      AND salesres.resvalue = pnOriVal
      *** Replace data
      REPLACE rescount WITH pnNewCnt, ;
        resvalue WITH pnNewVal IN salesres
      *** Commit changes
      llOk = TABLEUPDATE( .F., .F., 'salesres' )
      IF ! llOk
        EXIT
      ENDIF
    ELSE
      *** Exit with error if not
      llOk = .F.
      EXIT
    ENDIF
  NEXT
  IF llOk
    END TRANSACTION
  ELSE
    ROLLBACK
  ENDIF
  *** Return status
  RETURN llOK
ENDFUNC

Andy: It is. The strategy is to start a transaction, and then, for each row in the stack, retrieve the information we saved, find the appropriate record, and use a normal FoxPro REPLACE to update the record. Then explicitly use a TABLEUPDATE() to try to save the change to that record, and log the result of that update attempt. As soon as an error condition occurs inside that loop, the status flag goes to false, the loop is exited immediately, and the transaction will be rolled back. The status flag is set to false if the record can't be located in the results table, if the record is located but the data in the record has changed since we did the calculation, or if the update of the table after the replace fails.

Paul: So you're trapping errors at three levels: if another user deleted the salesman's record, if another user updates the same record before we try the replace, and, finally, if another user updates the same record while we're updating the record. We aren't using buffering except for the very short time where we're trying to write the update.

Andy: Right. I know it seems counterintuitive, but buffering is the last thing that we want. We need to process as little as possible if there's an error, to make the overall saving process as fast as possible. We are, in effect, using the stack as a buffer, to hold the changes we want to apply.

Paul: So this updating process is now effectively atomic. Instead of just calling the UpdComm() method, we do need to test that return result and handle a failure of that in some way. Here's the main routine again from last month; I suppose this isn't the right place to handle the retry, because we need to recalculate:

*** Exposed Main control function
*** Input parameters are the starting ID and
*** the base value
*** Returns Success/Failure
FUNCTION UpdComm( tnStmPK, tnValue)
  LOCAL lnNextID, llRetVal, lnCommDue
  *** Clear the stack
  DIMENSION This.aStack[1, 5]
  STORE 0 TO This.aStack
  *** Check parameters
  IF EMPTY( tnStmPK ) OR EMPTY( tnValue )
    *** Must pass a PK and value
    RETURN .F.
  ELSE
    lnNextID = tnStmPK
  ENDIF
  *** Start the Main loop here
  llRetVal = .T.
  DO WHILE ! EMPTY( lnNextID )
    *** Find the record in SaleTeam
    llRetVal = This.FindRec( lnNextID, 'saleteam', ;
      'stmpk' )
    IF ! llRetVal
      *** PK doesn't exist in SaleTeam
      EXIT
    ENDIF
    *** Get Next ID to use from SaleTeam



    lnNextID = saleteam.stm2FKmgr
    *** Now find the correct result record
    llRetVal = This.FindRec( saleteam.stmpk, ;
      'salesres', 'resstmfk' )
    IF ! llRetVal
      *** PK doesn't exist in Results
      EXIT
    ENDIF
    *** Calculate the commission payable
    lnCommDue = This.CalcComm( saleteam.stmlevel, ;
      tnValue )
    *** Push the data to the stack
    This.Push( lnCommDue )
  ENDDO
  *** If no failures along the way
  IF llRetVal
    llRetVal = This.UpdResult()
  ENDIF
  RETURN llRetVal
ENDFUNC

Andy: Right. The application is going to have to handle a failure in UpdComm(). That responsibility belongs in the application. I don't think it even belongs in the instance. In answering the must/should/could question, the code could be there, which suggests a subclass. But the handling of those sorts of errors must be application-specific, so it belongs in the application; otherwise, you have one subclass per application, which is pointless.

Paul: A simple approach would be to have a loop that calls UpdComm() a number of times, retrying until either it succeeds, the user runs out of patience, or you've performed a set number of retries.

Andy: Sounds about right. It's not our problem, though, is it? As you say, it's an application-level SEP.

Paul: I do have another really sneaky solution that solves the multiple user update problem.

Andy: Go on then, I'm sure I'll like it.

Paul: Do it single user.

Andy: Okay, I like it. What on earth are you talking about?

Paul: If you don't need the changes in real time, then write the data to make the changes into a queue file. Service that queue with a program that runs continuously, taking the oldest change and applying it to the table (marking it applied, of course). Since it's now single user (there's only one daemon), there's no update conflict to resolve. I actually thought of it before looking at Oracle, but it's a "new feature" in Oracle8. They claim it's a particularly good way to improve the speed of data entry.

Andy: All we need is the Primary Key and an increment for the count and the value. We don't need the results themselves, only the data to perform the calculation. In fact, we only need the value to add; we're always going to increment by one.

Paul: It actually could improve performance. Think about it for a moment. Do you have 12 data entry operators, all running the same code to make the changes and attempting to resolve the conflicts? Or is it better to serialize the updates and reduce the contention?

Andy: Yes, your update will never fail. Oh, you'd need a date/time stamp, of course.

Paul: Why? If all that's being done is increment and addition, it doesn't really matter if the changes of two users are interleaved in the queue—the end result is the same.

Andy: Agreed, unless it became important to apply updates only for a particular period. You might want to process updates only until the 31st of the month or something; I'd add the timestamp on principle. I'd prefer to have it there just in case rather than have to modify the system and put it in later.

Paul: I can see you might want it for recovery, too. Perhaps to reapply all updates for the week after restoring from the backup tape.

Andy: It's the sort of thing that will cost you virtually nothing to do, but you might find that there are thousands of uses for it—management information, for example. After all, it would be a tiny file—something you could keep for ages.

Paul: If we use queueing, is all of this irrelevant? We could just write the information straight into the queue from the stack, instead of using buffering or transactions in this process.

Andy: You mean APPEND FROM ARRAY THIS.aStack? It's just as well that I didn't acquiesce to your insistence on using a true stack; you'd have to pop off that stack into another array first.

Paul: Maybe so, but that was more by luck than judgment <g>. What actually seemed to be quite a difficult problem has actually proven to be rather simple, and we have two solutions: one using an explicit TABLEUPDATE() and a transaction, and one using a queue. ▲

Paul Maskens is a VFP specialist and FoxPro MVP who works as programming manager for Euphony Communications Ltd. He's based in Oxford, England. pmaskens@compuserve.com.

Andy Kramek is an old FoxPro developer, FoxPro MVP, and independent contractor and occasional author based in Birmingham, England, who now works in the U.S. 104074.3130@compuserve.com.

22 FoxTalk April 1999 http://www.pinpub.com


April Subscriber Downloads

Downloads
• 04COATSC.ZIP—Source code described in Andrew Coates' article, "Third-Party Reporting Tools."

Extended Articles
• 04DONNIC.HTM—"Best Practices: Seeing Patterns: Chain of Responsibility." This month, the Best Practices column continues the "Seeing Patterns" series, which discusses common object-oriented design patterns. The purpose of this series of columns is to describe each pattern briefly and then demonstrate examples of the patterns in Visual FoxPro applications. The focus of this column is on the Chain of Responsibility pattern, which allows for passing messages through various objects, giving each object an opportunity to act on the message.

• 04DONNIC.ZIP—Source code described in Jeff Donnici's article, "Seeing Patterns: Chain of Responsibility."

• 04HENNIG.HTM—"Reusable Tools: Building a Builder Builder." Creating builders is a snap using enhancements to Ken Levy's BuilderD technology. Doug Hennig shows you how.

• 04DHENSC.ZIP—Source code described in Doug Hennig's article, "Building a Builder Builder."

• 04HENTZE.HTM—"Visual Basic for Dataheads: A Command and Function Summary—Part 2." In Part 1 of this series, you got a glimpse of the language through a summary of functions. This month, Whil Hentzen covers the logic structures and commonly used commands in Visual Basic and compares them to their counterparts in VFP.

Editorial: Microsoft . . .
Continued from page 3

In fact, news of Microsoft's recent reorganization, where the company was being aligned across customer groups instead of products, was seen by some as a defense against such a breakup plan, since the breakup would presumably be made more difficult after the reorganization.

Of course, both of these were wrong.

Microsoft won't be divided into pieces across product lines—one company responsible for Windows, another for developer tools, a third for edutainment products. Nor will the company be split into functional lines, such as for business, home use, and so on. Instead, the company's breakup will be divided into geographic areas, just like Standard Oil was a century ago. Imagine: Microsoft New York. Microsoft Texas. Microsoft L.A. Fascinating, eh?

But fascination doesn't do much for you when you're trying to deliver an application. What does this mean to you?

First, you'll no longer have to listen to your customers whine about Bill Gates being the richest man in the world ("Gee, I'd sure like to be in his shoes."). He will become, as Jay Leno presciently joked recently, the world's five richest men.

Second, you'll soon see a flurry of competitive mechanisms arise as each of these companies struggles for market share. I predict that by the summer, you'll be able to buy Windows via kiosks on every street corner, and you'll be able to "pay as you go"—buying two weeks of Windows at a time, for example.

You think I'm joking? Knowledge Base article #216641 (Title: "Computer Hangs After 49.7 Days," Symptom: "After 49.7 days of continuous operation, your Windows 95-based computer might stop responding.") indicates that a regular fill-up of Windows might be a good idea. I wonder if, when you pull up to the full-service kiosk, a uniformed attendant will wash your monitor's screen for you?

You'll also probably see a few name changes. Microsoft New Jersey will become something goofy, like ExxonSoft. Microsoft Kentucky will rename its flagship product "Microsoft Winders." I don't know any of this for a fact—but events like these are bound to happen.

Finally, you'll be able to buy stock in companies like Microsoft Georgia, Microsoft New York, Microsoft Ohio, and Microsoft Illinois, thus allowing you to diversify your portfolio. And you can stop complaining about the tepid pace of government action, because this will all become possible as of, not coincidentally, April 1, 1999. I know I can't wait. ▲



FoxTalk Subscription Information: 1-800-788-1900 or http://www.pinpub.com

Editor Whil Hentzen; Publisher Robert Williford; Vice President/General Manager Connie Austin; Managing Editor Heidi Frost; Copy Editor Farion Grove

Subscription rates:
United States: One year (12 issues): $179; two years (24 issues): $259
Canada:* One year: $194; two years: $289
Other:* One year: $199; two years: $299
Single issue rate: $17.50 ($20 in Canada; $22.50 outside North America)*
* Funds must be in U.S. currency.

European newsletter orders: Tomalin Associates, Unit 22, The Bardfield Centre, Braintree Road, Great Bardfield, Essex CM7 4SL, United Kingdom. Phone: +44 1371 811299. Fax: +44 1371 811283. E-mail: 100126.1003@compuserve.com.

Australian newsletter orders: Ashpoint Pty., Ltd., 9 Arthur Street, Dover Heights, N.S.W. 2030, Australia. Phone: +61 2-9371-7399. Fax: +61 2-9371-0180. E-mail: sales@ashpoint.com.au. Internet: http://www.ashpoint.com.au

Direct all editorial, advertising, or subscription-related questions to Pinnacle Publishing, Inc.: 1-800-788-1900 or 770-565-1763. Fax: 770-565-8232. Pinnacle Publishing, Inc., PO Box 72255, Marietta, GA 30007-2255. E-mail: foxtalk@pinpub.com. Pinnacle Web site: http://www.pinpub.com

FoxPro technical support: Call Microsoft at 425-635-7191 (Windows) or 425-635-7192 (Macintosh)

FoxTalk (ISSN 1042-6302) is published monthly (12 times per year) by Pinnacle Publishing, Inc., 1503 Johnson Ferry Road, Suite 100, Marietta, GA 30062. The subscription price of domestic subscriptions is: 12 issues, $179; 24 issues, $259. POSTMASTER: Send address changes to FoxTalk, PO Box 72255, Marietta, GA 30007-2255.

Copyright © 1999 by Pinnacle Publishing, Inc. All rights reserved. No part of this periodical may be used or reproduced in any fashion whatsoever (except in the case of brief quotations embodied in critical articles and reviews) without the prior written consent of Pinnacle Publishing, Inc. Printed in the United States of America.

Brand and product names are trademarks or registered trademarks of their respective holders. Microsoft is a registered trademark of Microsoft Corporation. The Fox Head logo, FoxBASE+, FoxPro, and Visual FoxPro are registered trademarks of Microsoft Corporation. FoxTalk is an independent publication not affiliated with Microsoft Corporation. Microsoft Corporation is not responsible in any way for the editorial policy or other contents of the publication.

This publication is intended as a general guide. It covers a highly technical and complex subject and should not be used for making decisions concerning specific products or applications. This publication is sold as is, without warranty of any kind, either express or implied, respecting the contents of this publication, including but not limited to implied warranties for the publication, performance, quality, merchantability, or fitness for any particular purpose. Pinnacle Publishing, Inc., shall not be liable to the purchaser or any other person or entity with respect to any liability, loss, or damage caused or alleged to be caused directly or indirectly by this publication. Articles published in FoxTalk reflect the views of their authors; they may or may not reflect the view of Pinnacle Publishing, Inc. Inclusion of advertising inserts does not constitute an endorsement by Pinnacle Publishing, Inc. or FoxTalk.

The Subscriber Downloads portion of the FoxTalk Web site is available to paid subscribers only. To access the files, go to www.pinpub.com/foxtalk, click on "Subscriber Downloads," select the file(s) you want from this issue, and enter the user name and password at right when prompted.

User name: ginger
Password: haunt

