
1. General Implementation Concepts for VC

What is VC?

'VC' stands for variant configuration. It represents the process of generating a customized
assembly for a customer's requirements through the selection of product options, without
needing to create a different part number for each customer's variation of the product.

That didn't help much? Most of us are familiar with the process of ordering a custom
computer online. When you log in to the web site, you see a couple of general models to
choose from. You can then select one of these models, and customize it. Each of those
models is a configurable material. After you choose a model, you see a list of options that
you may select from. You are now in the model's configurator. As you select your desired
options, the configurator records them. When you place your order, your options are
communicated to the production line, where the operators choose exactly the items you
requested and assemble them into your customized computer. The company never needs to
make a single new part number in order to fill your order. This whole sequence of events is
what makes up variant configuration.

SAP also provides a configurator, complete with the ability to create rules that govern the
interactions and effects of selection of the various available options. It also has a great deal
of built-in integration to convert these options into usable downstream requirements for
production, purchasing, and the like.

SAP's configuration engine is extremely powerful, but its flexibility also makes it a bit
difficult to set up. Anyone who has used it before will also tell you that it is a bit quirky.
Thus, setting up a configurable product often seems like more of an art than a science.

It doesn't need to be. With proper management and execution, VC rollouts should be no
more difficult than any other SAP module (if you can excuse my calling VC a module for the
time being).

Should a VC project be treated as a technical project or a functional one?

The short of it is that a VC project can be treated neither as a purely functional one nor as a
purely technical one. Both approaches invite disaster, and even if the disaster is successfully
navigated, the project costs and timelines will certainly run long.

A well-run VC project treats the subject as a process initiative. It often entails a
fundamental shift in the approach to distributing a type of product. Marketing, engineering,
forecasting, pricing, sales, planning, procurement, inventory, production, shipping,
invoicing, accounting, warranty management, and customer service are all typically affected
by a VC implementation.

The development of a product in a VC environment should look at all of these as aspects of
one VC process. Even though SAP lists VC beneath the 'Logistics General' module, VC is
actually a superset of MM, PP, PS, SD, and classification.

Isn't VC just SAP's product configurator?


If you want to implement VC, and you think or speak of VC as a product configurator, even
euphemistically, you are already heading in the wrong direction. The configurator is just one
tool in the VC product model. There are many others which must be implemented well to
have a successful VC rollout.

It is better to think of VC as 'the process by which a highly variable product is delivered to a
customer according to their particular desires.' Defining this process is the first thing you
have to do. Structuring the tools to help you make the process transition necessarily must
follow this exercise. If you build the toolset first, count on your work being considered a
prototype (at best) or a complete waste of effort (at worst).

Not convinced? What would happen if you treat VC as simply a product configurator? Well, if
bills of materials and routings are not structured properly, you will not be able to build the
product. If the product data is structured in an overly technical manner, support must be
provided by an IT department, meaning that market flexibility and customer service both
suffer. If marketing and sales are not involved, changes in pricing and scheduling will cause
your customers to lose confidence in you as a business partner. If engineering and
marketing are not aligned properly, changes made by one will not be visible to the other,
and what the customer receives will not match what they ordered. The list goes on and on.

VC should not be taken lightly.

Should I use VC?

Just because something has always been sold with options doesn't mean that VC is a
natural fit. Are there other ways to model the product? As always, of course there are. You
can probably model the data in SAP the same way as you do now. There are obviously
drawbacks to this, which you are undoubtedly all too aware of. But you would be dealing
with a known, and managing a known commodity is always easier than taking on something
new.

Taking on VC mandates that you take on a complete review and possibly a complete
restructuring of your bill of materials, material masters, production routings, purchasing
processes, customer pricing, and change management processes. This is a major effort, and
if you are not ready to take that level of effort on, it is better to wait on VC. It can be rolled
out later. [As a side note, if the products are fairly simple, the potential impacts (good or
bad) from hurriedly rolling out VC will probably not be significant.]

On the other hand, if you have complex products, VC can definitely be worth the pains
involved in implementing it. The investment you make up front will reduce the efforts made
on each order if you currently have to validate, schedule, or otherwise coordinate activities.
If done properly, your data accuracy will soar, support costs will drop, customer satisfaction
rates will improve, inventory levels will drop, planning will improve, and lead times will
shorten. It can be a tremendous competitive advantage if you market your capabilities well.

It sounds like a lot to promise, but when you read on, you will see that it isn't VC itself that
is making these things happen. Rather, you need to fix the processes that are holding you
back from such goals if you hope for success using VC.

Again, VC should not be taken lightly.


Why have companies not lauded VC after implementation?

After seeing good implementations and bad ones, I firmly believe that there are remarkably
few project managers who incorporate VC properly. There are some incredibly proficient VC
consultants making sound, consistent models that lead to disaster for their companies.

All too often, this is because project management treats VC as an obtuse, technical hybrid
between other SAP modules. Because of this, the consultants take on a technical bent,
trying to build a workable data structure to sell the products as they are modeled in a legacy
environment. Mix in enough junior consultants who are just trying to figure out the more
technical aspects of VC, and you've got a formula for problems.

In most cases, managing the data conversion becomes such a challenge that fixing process
issues gets forgotten, or set aside. The thing is, once those process issues are put aside,
you are forced to superimpose the dreamed-of VC product models onto existing standard
product models, and it doesn't work. After conversion, manufacturing delays and inventory
problems abound, and support costs skyrocket. Some companies have never recovered
from this approach.

How then, do I approach a VC implementation?

Treat VC as a project in its own right. Just like the overall SAP project, there are process
issues, data issues, employee issues, and technical issues that all must be dealt with. If you
have just one VC consultant, he/she probably excels in only one or two of these areas. You
should hedge your bets, and make sure you have someone from your business identified to
cover each of the aforementioned areas.

Ultimately, the goal should be to develop a complete solution that can be handed off to
internal users with a minimal amount of technical training and ongoing support cost. Make
this a stated goal, establish targets to measure yourself against, and start planning to those
targets.

All of the concepts below are described under the assumption that your configurable
materials are complex, business-critical products. Depending on your particular situation,
some or even most of these points will not be as significant to worry about. You, of course,
will need to make those judgments.

The more technical portions of the discussion will be broken out into a separate topic, so
that it can be more readily digested by those readers who really need to think about the
technical details.

What process initiatives should I investigate?

Your process team should focus on developing a lean, customer-focused make-to-order
logistics cycle (depending on your business, this may be assemble-to-order, configure-to-
order, or engineer-to-order). Some things you may need to address:
Excessive product management - Organize your products into groups or families, and make
each of those a configurable material. SAP is not built to handle every custom version of the
product being treated as a new part number, and this practice is extremely detrimental to
ongoing BOM maintenance costs and accuracy.

Process control points - Decide what, if anything, your order control points add in the way of
value for your customer. If they don't do your customer any good, get rid of them or
automate them.

Product representation - Align your marketing, engineering, and manufacturing views of the
product, so that everyone is working towards the customer's view of the product. Their daily
activities may not line up exactly, but the results from each department must.

Lean manufacturing - Implement lean manufacturing concepts so that a modular bill of
materials can facilitate assembly speed instead of hindering it. Lean manufacturing also
creates flexibility, reduces response times, and reduces inventory costs. A lean bill of
materials is particularly important to develop. Lean manufacturing is a major topic in itself,
so I will not attempt to address it in detail here.

Product costing and pricing - Make sure your pricing model is aligned to your costing model.
If you can set up your options so that a customer decision that affects pricing directly
correlates to a costed assembly, then you have a mechanism for establishing pricing for and
determining the profitability of each option. Otherwise, you can only look at profitability for
the entire product line and make educated guesses about the pricing.

A simple way of thinking about this is: build it the way you sell it.

Order scheduling - Streamline and coordinate your quoting, production scheduling, and
delivery promising processes to take advantage of the integration in SAP and reduce your
management workload. SAP is able to auto-schedule an order for production. Let it do so
whenever you can. To do this, you will need to clarify and simplify your rules and processes
for handling expedite requests. [side note: I am not personally a fan of over-integrating
networks and sales orders, so if you are using PS, don't go too far too fast - auto-scheduling
at order entry probably won't help you much, anyway]

Custom engineering - If you allow custom engineered subassemblies in an assemble-to-
order environment, pinpoint the assemblies that customers may customize. Modularize
them, so that the other subassemblies can be built without engineering intervention. If this
is not possible, you may need to consider moving to a full-blown engineer-to-order
environment.

Product change management - Do not allow data maintenance to occur in a silo. All changes
to a product should occur in a highly regulated team environment that involves engineering,
manufacturing, and marketing. You should implement both systematic and organizational
controls to ensure that the three departments stay perpetually in sync.

System-wise, expect to implement ECM (engineering change management) together with
status management. I've seen companies try to use separate systems to prepare data (SAP
sometimes even promotes this with their ALE tools), but ECM combined with a rigid status
management is the only control that really works. I will discuss this more in the data
section.
Often, companies find that new roles must be defined to ensure that consistency between
the departments is enforced. Sooner or later, everyone realizes that enforcement is critical.
If you need to create a new position of clout in order to make sure everyone moves in
concert, do it.

Order change management - Streamline change processing activities to support your newly
integrated environment. Change orders can directly affect released production orders, so
new controls will need to be established. Make them simple and efficient so that you do not
lose your reduced production cycles.

Planning - Associate planning needs with product attributes so that you can plan your
material requirements by option. This will also place you in a better position to combine
usage history and your order pipeline (quotes / inquiries) in your planning activities.

Procurement - Take advantage of the lean bill of materials and arrange for product
attribute-driven ordering with vendors where possible. Subassemblies may themselves be
considered configured products, and subcontracted assemblies may require that your
vendors be able to handle configured part orders from you.

Direct customer ordering - Providing the customer with the ability to play-test a
configuration and place an order on their own terms creates a competitive advantage,
creates a data source for marketing analysis, and can reduce total transaction cost.

It is usually not a necessity, so you may want to consider adding it immediately after your
internal rollout. If your products are very complex, you may even be forced to use an
external configurator such as IPC, at which time opening the door for a select group of
customers may not be much of a stretch.

What data maintenance initiatives should I investigate?

The data side of the project should focus on establishing data accuracy and minimizing the
amount of data needed to support the VC process. Some things you may need to address:

Clean data - Eradicate BOM, pricing, and purchasing data errors, and make sure that
component / assembly lead times are as accurate as possible. Clean data is one of the most
critical get-ready activities you will have. SAP has a way of magnifying data errors, and
mixing them in with new processes will not be pleasant.

A special note about lead times - if your implementation is highly integrated, padding lead
times will cause your orders to schedule improperly in production. Since production orders
may be automatically generated, a bad item in a bill of materials will cause the order to be
unbuildable.

Option organization - Spend some time determining which characteristics (product
attributes) and classes (groups of characteristics) are needed before you actually create
them. Use a spreadsheet of some sort to organize them and streamline them, so that you
can quickly identify inconsistencies and data errors.

The process of defining your configurable products can be greatly accelerated by simplifying
customer options wherever possible, and consistently reapplying characteristics and classes
across product lines. Take advantage of the fact that a single product may have several
classes of characteristics, so that future maintenance is reduced. Just be sure not to overdo
this - keep the groupings aligned to the nature of your product, so that the data is not
confusing. In a good design, the data structure should reinforce the construction of the
product itself.

Don't even think about developing rules until you've completed this process. They will be
wrong, or worse, they will lead you into building characteristics because of a perception of
need that may not really exist.

Lean bills of materials - Organize your BOMs into lean, flat 'super BOMs'. Each
configurable material should have only one super BOM, which contains a component or
assembly for each selectable option. Rules (object dependencies) will need to be built to
decide when and how each item will be used.

It is often useful to make the BOMs modular, so that if an option is used in more than one
configurable product, the same subassembly may be assigned to both super BOMs. Locating
option usage and making BOM changes will be much easier this way.

Creating a lean super BOM structure is critical to the success of the VC project, as it will
directly translate to data maintenance costs - VC will bring additional maintenance
requirements, and they will need to be offset by savings in BOM maintenance.
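To make the idea concrete, here is a rough sketch of what a super BOM might look like for the custom-computer example from the introduction. All material numbers, characteristic names, and values here are invented for illustration; the real structure depends entirely on your product.

```
Super BOM for configurable material PC_MODEL_A (one BOM covers every variant):

Item  Component     Selection condition (object dependency)
0010  CASE_MINI     FORM_FACTOR = 'MINI'
0020  CASE_TOWER    FORM_FACTOR = 'TOWER'
0030  PSU_450W      GRAPHICS = 'ONBOARD'
0040  PSU_750W      GRAPHICS = 'DEDICATED'
0050  MAINBOARD     (no dependency - used in every configuration)
```

At order time, only the items whose conditions are satisfied are exploded, so no customer variant ever needs a BOM of its own.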

Lean production routings - Just as you did with the bills of materials, organize your
production operations into lean 'super routings'. This has the same benefits, and is usually
a fairly simple exercise in comparison.

While you are doing this, associate product attributes with production impacts - if a
selection affects labor times, note it. If you miss this important step, your production lead
times and product costing will both suffer.

Stockable variants - If you have some commonly ordered variations of a product, create
material variants for them. A material variant may use the same configuration rules and
bills of materials as the configurable product it is based on, but it can be stocked and
shipped from stock.

Maintenance processes - You will need to develop a procedure to maintain your data in such
a way that integration visibility is not lost. The more mission-critical your product line is, the
more mission-critical this activity is. If you do not set up any procedures at all, data can
easily be changed in such a way as to cause production orders to be built incorrectly or
customer invoicing to completely fail.

I have seen three methods of maintaining the data, and each can be used successfully. You
can use ECM, ALEs, or a combination of the two. I personally prefer using ECM together with
a healthy dose of status management. It may seem like more work at first, but it is far less
painful in a live environment. The concept is to put release status on each data element,
and one on the ECM record they all must be tied to. If any of the data is not released, the
ECM record cannot be released, and the change cannot take effect. If only certain
individuals can release the ECM record, you have a pretty secure method of releasing
changes into production. The trick is to design the status management well. It has both
procedural and technical implications.

The method SAP often recommends is to set up a separate production client where data is
prepared, and subsequently released together to the primary production client. If you have
ever heard VC associated with ALE (Application Link Enabling), it was because someone
was using this approach, as ALE is the tool used to migrate the data back and forth.

I am not a fan of the ALE approach. There are two big problems with it. The biggest
problem is with the separate system theory itself - it makes the assumption that VC is just a
configurator with data maintained in a silo, and as such, it always misses critical integration
points. I have never seen it done without data eventually getting out of sync, causing
production orders and invoices to get created incorrectly, thus causing the very events it
was intended to prevent. The second problem is the technical costs of keeping two
production clients synchronized. It is a painful exercise, to say the least.

The third data maintenance method is the worst of all - attempting to mix methods one and
two. Maybe that's a bit strong, because it's leaps and bounds above not having any controls
at all, but in my opinion, it's the worst of the three I've listed. You will find that some
changes (like making parts available for procurement) must take place in your productive
environment ahead of the product rollout. Inevitably, the ECM records in the two systems
will get out of sync, and you will have a very unpleasant surprise on your hands. You will
probably get the worst of both worlds, instead of the best of both, as you had intended.
Often, companies that try both find themselves backing off the dual-client approach,
because they find ECM to be too valuable to cast off.

Again, count on using ECM and a lot of status management.

2. The Basics of Product Modeling

Product Modeling

The technical side of your VC project should be focused on making a lean product model
that is simple to manage. All too often, a VC rollout leaves a client with a mess of obscure,
inconsistently written object dependencies. Since the client usually has only one or maybe a
handful of individuals trained to maintain dependencies, they are left with a costly,
unsupportable data maintenance model that is critically susceptible to turnover.

The biggest cause of this is simply modeling a product in VC without considering all of the
functional implications that a VC process brings. When you jump headlong into developing
dependencies for a complex product, it is extremely difficult to come up for air, much less
think about your conceptual approach.

If you have a technical individual associated with your VC project - and chances are, you do
- spend some time on your overall design. Don't even prototype for the first couple of weeks
or even months. If you build a consistent, clean global model, the process of modeling each
product will go much quicker.

If your product is very simple, with just a handful of characteristics, and rare changes, this
is not so important. One rule for this might be, 'can I keep the entire product model in my
head at one time?' If the answer is yes, then go ahead and do as little as you want. If the
answer is no, then plan before you act.

For the sake of discussion, let's assume your product is of the not-so-simple variety...
Please be aware that this document uses a number of SAP terms without attempting to
explain them. It simply will not be possible to go into that level of detail here - be sure to
refer back to other data sources for help if any term doesn't make sense.

Getting Started

I will do something a bit unorthodox in the SAP consulting world, and say, let's forget
about standard functionality for a while. Cleanse your mind of all thoughts of SAP and
legacy systems, and the way things have been done or will have to be done. Why? You can
artificially limit your model by constraints that don't really exist (no pun intended).

Instead, think about how the product would be most simply and elegantly modeled in a
beautiful world. Organize your products, find similar product attributes, and really classify
them - not in the SAP functional sense, but in the true taxonomical sense of the word.
Eliminate redundancies, find similarities, correct names, and so on.

Then, tie this view to your manufacturing and marketing views of the product. Associate
your production assemblies with customer options, and make sure they really line up. You
may have to note impacts in your current business model, but don't let those impacts stop
you - save those discussions for the prototyping phase. You also may have to restructure
your dream view a number of times until you end up with a clean, consistent view of the
product.

Once you have gotten to this point, start churning through whatever legacy systems are out
there, scrub the data, and line it up with your model. It is very important to work from the
standpoint of the legacy systems, because it is the only way that you will be able to ensure
that accuracy is built into your model.

Next, simplify your product model and make it even easier. I don't mean to say you should
only consider a subset of the options - I mean that you should think abstractly, and reduce
the model to the lowest possible complexity. Start over if you need to. This is the best
opportunity you will have for do-overs.

If you did a good job, anyone who understands the product should be able to make changes
easily in your dream model. You should also be able to recreate any real-life example, at
least on paper.

After you finish this exercise, look at your results. Did you end up with a bunch of cryptic
dependencies? I'll bet not. More than likely, you have some very crisp spreadsheets,
decision trees, truth tables, and the like. The nature of the product should be extremely
visible simply by looking at your results.

Requirements Definition

It's time to translate your requirements into characteristics, classes, and dependencies,
right?

Wrong. Keep SAP out of your head for just a little bit longer. You do not have a
requirements document yet. What you have is a true knowledgebase document that you will
use to drive your further activities. It may not be the finished picture, but it's probably the
best one the company has. Now, it's time to translate this into a true requirements
document (again, without SAP or any legacy system in your head!).

Defining characteristics and classes is the easy part. They should scream out of your
knowledgebase document, as will dependency rules. It's probably why you were tempted to
start building dependencies in SAP already. Go ahead and put those terms on your
document for your own sake, just so you can stop thinking about them.

Where you need to focus is on how a user will maintain this data. Dream a little bit. Do you
desire a nice spreadsheet entry system? Drag/drop maintenance? Simple tables? Great. You
are heading in the right direction. If you would like it, chances are that someone with less of
a technical mindset than you will need what you are thinking of.

Now, you have the requirements for the model you will implement. Translate it into SAP
terms, and you're ready to go. There's a lot of hidden meaning packed into that sentence,
so read it again if you need to.

Obviously, this is where the hard work is going to be. You may even have to restructure
your knowledgebase document again. Whatever the case, invest some time here, because
the savings in the steps to come will more than make up for it, and the client will be many
times happier for your efforts. Instead of pleasing them, it's time to wow them.

Now, you're ready to draft your technical requirements document. Here, you have a number
of tools at your disposal: object dependencies, class nodes, variant tables, variant functions,
and reference characteristics, to name a few.

To translate your functional requirements into technical ones, you first should understand
what these tools are, and what they do and do not give you.

Object Dependencies

Object dependencies are the standard method for placing rules on your product
configuration. To understand them, it is best to place them in the proper context.

During configuration, there are four types of dependencies. Preconditions say when a
characteristic or value may appear. Selection conditions say when some characteristic or
value must be selected. Procedures cause some event to occur, such as setting another
value or changing the pricing quantity. When they are assigned to a characteristic, they
execute when the user enters a value for the characteristic; when assigned to the profile,
they execute when the user enters any value. Constraints act as a combination of the other
three, and allow you to limit, require, or infer characteristic values. Constraints should
generally not be mixed with preconditions or procedures (there are exceptions, of course).
Because constraints do not execute sequentially with procedures, interactions can be
unpredictable, and resolving logical conflicts can be difficult.
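To give those four types some shape, here is a sketch of each in SAP's dependency syntax. The characteristic names and values are invented, and the syntax is written from memory, so treat it as an illustration to verify against your own system rather than copy-paste code.

```
* Precondition on the value COLOR = 'MET_BLUE':
* metallic blue may only be chosen for the deluxe model
MODEL = 'DELUXE'

* Selection condition on characteristic VOLTAGE:
* a voltage must be entered whenever a motor drive is chosen
DRIVE = 'MOTOR'

* Procedure (e.g., on the configuration profile):
* infer a weight from the frame choice
$self.NET_WEIGHT = 150 if FRAME = 'STEEL'

* Constraint (maintained in a dependency net):
OBJECTS:
  PC is_a (300) PC_MODEL_A.
RESTRICTIONS:
  PC.PSU = '750W' if PC.GRAPHICS = 'DEDICATED'.
INFERENCES:
  PC.PSU.
```

Note how the constraint both restricts the allowed combination and infers a value, which is why it overlaps the other three types.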

During a BOM explosion (SD/PP), task list generation (PP), or network generation (PS),
there are two dependency types available. Selection conditions govern when a specific
component/assembly, operation, or activity is selected. Procedures may be used to infer
values (such as quantities) into BOM items, routing operations, or network activities. Both
of these are actually executed during the result generation, and not during the configuration
event.
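A sketch of what those two look like on a BOM item follows. The item names are invented, and the reference characteristic STPO_QTY (created against table STPO, field MENGE, the BOM item quantity) is an assumed piece of setup, not something SAP delivers by that name.

```
* Selection condition on BOM item RAIL_KIT:
* the rail kit is only exploded for rack-mounted units
MOUNTING = 'RACK'

* Procedure on a BOM item, inferring the item quantity:
* assumes a reference characteristic STPO_QTY pointing
* at table STPO, field MENGE
$self.STPO_QTY = FAN_COUNT
```

Because both execute during result generation, a wrong quantity or a missed selection only becomes visible when the order's BOM is exploded, which is one more argument for testing each dependency as you write it.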

Actions are obsolete forms of procedures which are poor, unpredictable performers. Avoid
them - anything that could be done via an action can be done with a procedure.

In all cases, the best rule of thumb is to test each and every dependency immediately after
creating it. This makes the process necessarily slow and tedious. But, if you do not do this
while you create the data, the first real test will be post-go-live. No one can effectively wade
through 10,000 dependencies during a single test phase, and test every combination for
interaction effects.

If the selection conditions on your bills of materials are simple enough, and you use very
vanilla object dependencies in your configuration, you can take advantage of characteristics
planning. Characteristics planning allows forecasts to be generated for assemblies /
components based on expected utilization of characteristic values. The user places a
percentage next to each value (e.g., 20% red, 30% blue, and 50% green), and SAP
essentially multiplies out every combination of values that can affect BOM selection to
determine future procurement requirements.
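A small worked example, with invented numbers, shows how those percentages multiply through to a component forecast:

```
Component FRAME_RED_L is selected when COLOR = 'RED' and SIZE = 'L'
Planned value usage:  COLOR = 'RED' 20%,  SIZE = 'L' 40%
Planned quantity of the configurable material: 1,000 units

Expected requirement for FRAME_RED_L:
  0.20 x 0.40 x 1,000 = 80 units
```

With more than a handful of characteristics affecting selection, the number of combinations SAP must multiply out grows quickly, which is why complex dependencies can break this feature.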

If you use dependencies that are very complex, however, characteristics planning may not
function, and post-go-live support for dependency writing may be very high. Dependencies
are quirky, undocumented, and a bit obtuse for most users. The tool SAP gives you to
maintain dependencies (the editor) is also a pain in the neck to use, and does little to help
you along. The best dependency writers are coders or engineers who aren't afraid to hack
around a bit. Unfortunately, those are the same individuals least likely to be thorough
testers!

Class Nodes

Class nodes are useful for simplifying BOM item selections. This involves classifying the
components or activities to be selected, and assigning the class to the BOM as an item (item
category K). If the values specified in the configuration match the values in the
classification, then the component is selected. It is very much like variant type matching in
a sales order.

The nice thing about class nodes is that if your configuration process is simple, you can do it
without any dependencies. Or, if you have a split maintenance model, where one person
maintains the configurator rules, and another manages BOM item selection, the second one
doesn't need to learn to write dependencies.
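A sketch of the matching mechanism, with invented names (class type 200 is commonly used for classes assigned as BOM items, but check your own class setup):

```
Super BOM item 0060: class MOTORS (class type 200), item category K

Materials classified in MOTORS:
  MOTOR_A   POWER = '1.5KW'   VOLTAGE = '230V'
  MOTOR_B   POWER = '2.2KW'   VOLTAGE = '400V'

Configuration values POWER = '2.2KW', VOLTAGE = '400V'
  -> the class node resolves to MOTOR_B at BOM explosion
```

The configuration values simply have to match the classification values, so adding a new motor is a classification exercise rather than a dependency-writing exercise.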

There are two limitations to using class nodes - you cannot build 'OR' relationships (e.g.,
component X is used under either of two conditions), and you cannot use characteristics
planning.

Variant Tables

Variant tables allow you to hold your rules in tabular form, so they can be managed much
like a spreadsheet. They can only be used in conjunction with an object dependency, but
they may be used with almost any object dependency.

Variant tables may be created as either VC-only objects, or they may be created with
reference to a database table. If you have very few and very simple rules, you will not need
a database table. Otherwise, database tables are essential.

During configuration, SAP uses them in a 'lookup' context. That means, you have to tell it
exactly what to look for in the table, and it does just that. The behavior of the lookup is a
little different in each dependency type.

In a precondition or selection condition, SAP looks for a table entry exactly matching the
one specified in the object dependency. If it finds a match, the condition is met. Otherwise,
the condition fails.

Unfortunately, since the lookup must have an exact match, and there is no way to perform a
range lookup against the table, you may find that you have to make hundreds or even
thousands of table entries to represent certain complex conditions. If you have three
characteristics, A, B, and C, each with 20 possible values, and only 10 of each make a valid
condition, you will need to make 10x10x10 table entries in order to list every possible valid
combination.
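To make the scale concrete, here is an illustrative sketch in Python (not SAP's dependency syntax; the characteristic names and values are invented) of a variant table used as an exact-match lookup:

```python
# Hypothetical illustration (not SAP code): a variant table read in a
# precondition/selection condition context - exact match or nothing.
from itertools import product

# Only 10 of the 20 possible values of each characteristic are valid.
valid_a = [f"A{i:02d}" for i in range(10)]
valid_b = [f"B{i:02d}" for i in range(10)]
valid_c = [f"C{i:02d}" for i in range(10)]

# Every valid combination must be listed explicitly: 10 x 10 x 10 rows.
variant_table = set(product(valid_a, valid_b, valid_c))
print(len(variant_table))  # 1000 entries just to express one condition

def condition_met(a, b, c):
    """The condition holds only if the exact triple appears in the table."""
    return (a, b, c) in variant_table

print(condition_met("A01", "B05", "C09"))  # True
print(condition_met("A01", "B05", "C19"))  # False - no range lookups
```

The set membership test mirrors SAP's behavior: there is no "between" or wildcard semantics, so every allowed row must exist as data.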

Simple precondition / selection condition relationships (such as 1:1 relationships), however,
are handled very efficiently by variant tables.

In a procedure, a table lookup is used to set a characteristic value. Essentially, the first (n-
1) columns of the table behave as a database key, and column (n) contains the data to be
returned to your procedure. You will need to take care that the data in the first (n-1)
columns always form a unique key, or the procedure will fail to function properly.
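A minimal sketch of this key-plus-value behavior, in Python rather than SAP's dependency syntax (the table contents are invented for illustration):

```python
# Hypothetical sketch (not SAP code): in a procedure lookup, the first
# n-1 columns act as a database key and column n is the returned value.
rows = [
    ("STEEL",  "LARGE", 42.0),
    ("STEEL",  "SMALL", 17.5),
    ("TIMBER", "LARGE", 30.0),
]

lookup = {}
for *key, value in rows:
    key = tuple(key)
    if key in lookup:
        # A duplicate key makes the result ambiguous - the procedure
        # would fail to function properly, so treat it as a data error.
        raise ValueError(f"non-unique key {key}")
    lookup[key] = value

def set_characteristic(material, size):
    """Return the value the procedure would assign for this key."""
    return lookup[(material, size)]

print(set_characteristic("STEEL", "SMALL"))  # 17.5
```

The duplicate-key check is the point: if two rows share the same first n-1 columns, the result of the lookup is undefined, which is exactly the failure mode described above.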

Just as with class nodes, variant tables may not be used in conjunction with characteristics
planning. They also perform very poorly when you have too many tables, or too many
entries in a single table. The reason for this is that SAP creates an interpretation structure
on top of the variant table, and this structure must be resolved continuously throughout the
configuration process.

Variant Functions

Variant functions represent the ultimate in flexibility. They are also the ultimate in
customization. If you don't like how SAP behaves, you can do whatever you want by calling
an ABAP function module from within your dependencies. These don't work everywhere
(BOM selection conditions, for example, completely ignore them), so you will still have to be
a little bit creative if you use them.

Another variation of this is the PFUNCTION, which is a variant function that is specifically
designed to allow you to access the configuration data in a controlled manner.

Some people are naturally quite scared of variant functions, likening them to something
between a user exit and a core code change. It may feel like you are tinkering with the
inner core of SAP's configuration behavior, which certainly could not be good during
upgrades.

This is an unwarranted fear. SAP does not make a habit of providing user exits that are
detrimental to your long-term health as a cash customer. To the contrary, SAP provided a
pretty good, albeit incomplete, toolset to keep you safe, and actually shelter your model
from the threat of an upgrade. The companies that have used them well have been
exuberant.

I have seen some very elegant and impressive solutions put together around this
technology. To me, this is the mark of a real top-notch VC consultant, and I have met only
three of them over the last ten years. They do a little bit of work, and look like wizards. Not
surprisingly, I have heard on exactly these (and only these) implementations, 'you should
sell this solution to SAP.'

To me, there is no wonder why the standard functionality in SAP is not as elegant as those
solutions. Those configuration models were all built around a particular concept of a product
model, and were all very unique in some way. SAP, on the other hand, has to support all of
the different kinds of models that may come along. So, they have to keep everything
generic, simple, and powerful.

The most powerful tool they provide in VC is the variant function. If used creatively, you can
build a model where you only have one precondition, one selection condition, and a handful
of procedures. Every BOM item can use the same selection condition, and every
characteristic can use the same precondition. No more complex dependency logic. Instead,
there are central programs which are maintained by IT, and some tables which are
maintained by the business users that actually understand the products.
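The table-driven idea can be sketched as follows. This is illustrative Python, not ABAP, and the rule table and function names are invented - a real variant function would be an ABAP function module reading a custom table:

```python
# Hypothetical sketch of one generic, table-driven selection check that
# every BOM item could share, replacing per-item dependency logic.
# The rule table below would be business-maintained custom table data.

selection_rules = {
    # component      -> characteristic values that select it
    "FRAME_STEEL":  {"MATERIAL": "STEEL"},
    "FRAME_TIMBER": {"MATERIAL": "TIMBER"},
    "PAINT_KIT":    {"FINISH": "PAINTED"},
}

def component_selected(component, configuration):
    """One generic check replaces a per-item dependency: the component
    is selected when every required value matches the configuration."""
    required = selection_rules.get(component, {})
    return all(configuration.get(k) == v for k, v in required.items())

config = {"MATERIAL": "STEEL", "FINISH": "RAW"}
print(component_selected("FRAME_STEEL", config))  # True
print(component_selected("PAINT_KIT", config))    # False
```

The design choice is that the logic lives in one central routine while the product knowledge lives in data, so business users change table rows rather than dependency code.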

To be sure, it will take some time to build such a solution. If your knowledgebase was built
well, the parameters for your design should be fairly clear. You will have to exercise some
creativity to convert your dream model into a reality, but you will be working towards the
dream model.

The variant functions themselves are not all that hard to write - they are typically quite
simple as function modules go. The trick is making a maintenance system to simplify the
maintenance of your now tabular data. You may spend several weeks to several months
setting up this process. It will be yours, though, so it will not be heavily impacted by
upgrades.

Again, the standard characteristics planning tool will not work using variant functions.

Reference Characteristics

One last tool that should not be underestimated is the reference characteristic. This is a
characteristic that has a pass-through connection directly to transactional data.

The most common example is to make a characteristic that references the pricing field
SDCOM-VKOND. When a value is entered in the reference characteristic, it is also placed
directly into the internal table SDCOM, in field VKOND. This is then used by the pricing
engine to determine the price for a particular option.

But, reference characteristics may do much more, like update the quantity on a BOM item,
or increase the labor required in a routing operation. They can also be used to read
information from the transaction, like any piece of information on a sales order header, BOM
item, or operation item. This additional piece of information may be just what you need to
make your solution complete.

A Little Promotion for Tables and Function Modules

The benefits of using these two tools will become clear rapidly.

First of all, your implementation schedule will probably be shorter and cheaper. In most
cases, the data from your knowledgebase documents can be uploaded with minimal effort.
At one company I worked with, the team spent 18 months developing a product model for
their second go-live division. Between #2 and #3, we simplified the process using pieces of
the model I described above. Even with this extra work, we finished setting up the VC data
in #3 in six months. Division #4 needed only two months (thanks in part to an increased
understanding of the legacy data), and it was just as complicated a product as #2. This is a
serious savings.

Second, your data will be cleaner. By using an upload, initial error rates are reduced
dramatically, and the shorter cycle time translates to more time spent validating the results.
Referring back to my earlier example, divisions #3 and #4 had an error rate of 10% of what
was in #2, and #2 was considered pretty good. In fact, sales orders out of #4 were far
cleaner immediately after go-live than they were ever before, which is a rare event for an
SAP go-live.

Third, your IT support costs will be much lower. By minimizing the complexity of
dependencies, you will have more ownership by users, and you will have fewer points of
failure in your solution. A fix in one place will fix many problems. A simpler model also
means less time helping users write dependencies, resulting in increased capacity to
address more critical productivity issues. Also, because you are not overly tied to
dependency logic, you have reduced exposure to upgrades - simply create a couple of
functions and/or tables, and you can rapidly migrate your entire solution into another
language (maybe Java) if need be. That is not the case if you have 15,000 dependencies
that must be individually tested (if not rebuilt from scratch).

Fourth, business flexibility will be greatly improved. Because training will be much simpler
and faster, the company's exposure to turnover will be greatly mitigated. And, placing more
ownership of the product model in the hands of the business owners of the product, you can
react much more appropriately to changes in market demand. Where you may have needed
a week to coordinate a change with IT, you may be able to make it in minutes by tweaking
a couple of table entries.

Characteristics Planning

Characteristics planning is hit so hard by all but the simplest solutions that it deserves some
consideration on its own. It simply does not work unless you use very generic
dependencies.

If it works for you, great. If not, did you waste your time so far? No. You found out that the
classic SAP solution does not support your business needs, and you did so without making a
flawed model and discovering the flaw after go-live.

The good thing is that you can still plan. You will just have to do some analysis to figure out
how.

You may find one of the other standard planning strategies is quite sufficient. If so, your job
is easy, and you have no worries.

Or, you may find as I often have that your solution already contains all the keys for
planning, and a custom planning system built around standard functionality is a very minor
custom development. You will just need a bit of technical acumen to pull this last method off
without creating undue impacts on your system upgrade strategy.

Engineering Change Management

Finally, be sure to give plenty of consideration to your use of engineering change
management. You may not think you need it, but if your product changes, you will. The
impact from not using it can mean failure in mission-critical applications such as production
and invoicing. There are alternatives, but I have not seen a single one that took less than
twice the effort of ECM to use.

Setting up ECM in a non-custom VC environment is not very difficult. Standard SAP objects
are already built to be linked with it.

The problem is, using anything custom with ECM is not native to SAP. You will need to link
your custom tables, etc., with ECM, so changes there can be grouped together with BOM
changes, material changes, and so on.

It is not as difficult as it sounds, though, and SAP has done a pretty fair job documenting
how to do this. What you will need to take particular care with is that you build ECM into
whatever tables you create. This will be discussed in the VC knowledge corner (someday
soon, hopefully!).

3. Multi-level Configuration Approach Considerations

You may find that one produced configurable material contains a number of configurable
subassemblies, or that a sold configurable material is really a kit of sorts, which are
themselves deliverable (and possibly configurable) materials. These are but two examples
of multi-level configuration.

Multi-level configuration models tend to be considerably more complex than those for
single-level configurations. You should not, however, be intimidated - SAP handles most of
these quite well, if you approach the product modeling with a little caution.

Where to Start

Before you begin classifying anything, you will need to perform additional analysis regarding
the sales and logistics processes for the product you are modeling. Your model must
support the way the product is sold, built, costed, shipped, and invoiced. Perhaps, in your
analysis, you may find that some critical functionality is not supported by existing business
practices, and these practices will need to be changed. Obviously, this will need to be done
before you go off modeling the product.

What, specifically, should you look for? Well, it's not easy to give a straightforward answer,
because the questions you should ask will vary based on the business model you are
supporting. But, below are some common considerations. This is obviously not going to be a
complete list; it is intended to serve only to get you thinking about the product model you
need to be working toward. The questions and answers here are very particular to your
company's business model, so you will need to perform some due diligence in your analysis
at this point.

Once you have a picture of the logistics model you will support, you can begin to model the
product itself according to your picture.

Multi-level Structures in Sales

Is the product sold as a single line item, or a kit of parts? How about production? Shipping?
Invoicing?

If you find that you must produce a number of separate final assemblies, then you should
represent this by multiple lines on your sales order. This will facilitate shipping, and allow
you to short-ship if necessary. You can achieve this effect through the use of a sales BOM
explosion. That is, in the bill of materials for the top part, you make sure that certain items
are marked as relevant for sales, but not relevant for production. Then, you set the
structure scope in the item category of the sales order line to support sales order
explosions. The main configurable material will become the higher-level item in the sales
order, and the kit (for lack of a better term) will consist of one or more sub-items.

SAP has a couple of standard item categories created to support sales explosions, namely
TAC and TAP. They are a decent starting place, but I have found that there are often
enough necessary changes to these to warrant making your own versions.

Kits are quite a topic by themselves. You will need to consider many things when using kits,
which are not necessarily VC-related. I will try to touch on some of these, but I will not
delve into detail unless the detail somehow is relevant to a multi-level VC configuration.

Multi-level Structures in Production

If your top-level product is delivered complete, but it consists of multiple assemblies which
are also configurable, you may want to consider using collective orders in conjunction with a
multi-level configuration.

A collective order is defined by a leading order (the top-level configuration) and a group of
one or more trailing orders (the lower-level configurable assemblies). Each order is a
separate production order, but the trailing orders are not issued to nor received in
inventory. Instead, they are directly consumed by the leading order. Also, as the schedule
of one of the orders changes, the entire collective order is moved correspondingly.

Advanced Mode Configuration in IPC

There are two reasons you might be using IPC. The first, and most obvious, is that you are
exposing your products online. This alone should never be enough of a reason to model in
advanced mode; many problems can arise, as you will see below.

However, some products simply cannot be modeled effectively in classic-mode VC in R/3.
For example, imagine a custom computer manufacturer developing servers in a rack-mount
array. One rack may have between one and 25 different configurable blade servers. In
classic R/3, for a 25-blade order, all you could do is create a line item for the rack plus 25
separate line items, one for each configurable blade server. But, because your custom blades are
designed to interact with each other in a very special, predefined way, each blade's
configuration must be visible to the other 24 blades' configurations. Also, you physically
assemble all 25 blades, and you need to capture the production activity.

Advanced mode modeling solves this quite well, as you can have multiple instantiations of a
blade configuration within a single rack configuration (R/3, on the other hand, currently
cannot perform such dynamic instantiations). In this case, you could configure this blade
server in advanced mode in IPC, and communicate the resultant configuration back to R/3
in terms of a completely defined order BOM. The interface is a bit tricky, but it is feasible.

Ultimately, in R/3, you will end up simulating one of the other two models above in order to
fulfill the order. You might find yourself programmatically constructing a sales BOM
structure (add each blade to the sales order as sub-items of the rack). Or, you might find
yourself programmatically creating a collective order structure (add each blade to the rack's
assembly order, and configure it in the production order).

Order Processing Implications

For classically-modeled multi-level configurations, you will need to consider your
presentation to the user. It is unreasonable to expect most users to drill into the
results of a top-level configuration in order to complete lower-level configurations.
SAP's interface is just not very intuitive here (IPC does much better, so I do not consider it
as much of a concern). You should always make sure the entire configuration can be
resolved based on the user input at the top level.

If you have many characteristics or levels, and a single list is confusing, consider using
characteristic groups (defined in the interface design in CU50). These will allow you to group
characteristics into logical units, and present a very intuitive tabstrip format to your users. It
isn't fun to maintain the user interface in CU50, but it is standard functionality that takes
little real effort to take advantage of.

Inheritance

You also will need to consider how you will carry the knowledge from the upper-level
configuration down to the lower level configurations. SAP does have a concept of inheritance
where values are copied down if the same characteristic exists at both parent and child
level, but it does not work very well on its own. Sometimes values just don't carry down.

So, you will need to create a procedure or constraint to copy values from one level to the
next, and assign it to the configuration profile of each child configurable material. It will help
tremendously if you are consistent in your characteristic design - if 'color' is needed at the
child level, make sure you assign the exact same 'color' characteristic to both the parent
and child.
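The copy-down logic such a procedure or constraint performs amounts to this (a Python sketch with invented characteristic names; the real implementation is a dependency on the child's configuration profile):

```python
# Hypothetical sketch of forced inheritance: copy every characteristic
# that exists at both parent and child down into the child configuration.
parent = {"COLOR": "RED", "VOLTAGE": "230V", "RACK_SLOTS": "12"}

# The child reuses the exact same characteristics as the parent,
# which is what makes a simple copy-down possible.
child_characteristics = {"COLOR", "VOLTAGE"}

child = {name: parent[name]
         for name in sorted(child_characteristics)
         if name in parent}
print(child)  # {'COLOR': 'RED', 'VOLTAGE': '230V'}
```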

Be careful when using class hierarchies, though. These are widely misused, and cause
needless complexity to many product models. What they are intended to be used for is to
limit option availability based on product relevance. They achieve this through the
denormalization of characteristic values. For example, class 'building' may have a
characteristic 'framing method', with allowed values 'stick frame', 'concrete', or
'steel'. Another class, 'residential', also has characteristic 'framing method', but it has
allowed values 'stick frame', 'concrete', and 'timber log'. Now, class '1-family house' is
assigned to both of the other two classes in CL24. The result is that the 1-family house only
allows 'stick frame' and 'concrete' framing methods. Inheritance from two separate
classes causes only the intersecting values to be allowed.
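The value intersection in this example can be expressed directly (illustrative Python, using the classes and values from the example above):

```python
# Sketch of restrictive inheritance in a class hierarchy: a class
# assigned to two parents may only use values allowed by BOTH.
building    = {"FRAMING_METHOD": {"stick frame", "concrete", "steel"}}
residential = {"FRAMING_METHOD": {"stick frame", "concrete", "timber log"}}

# '1-family house' is assigned to both classes, so only the
# intersecting values survive.
one_family_house = {
    "FRAMING_METHOD": building["FRAMING_METHOD"]
                      & residential["FRAMING_METHOD"]
}
print(sorted(one_family_house["FRAMING_METHOD"]))
# ['concrete', 'stick frame']
```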

It can be very powerful, but is not primarily intended to drive multi-level configuration. It is
primarily a master data construct, and not a transactional one.

Often, a maintenance-conscious analyst will see the class hierarchy as a great method to
allocate classification to materials. The idea is, if there is a set of characteristics that are
used in many different materials, place this subset of characteristics in a class, and assign
that class to each material's main class via transaction CL24. Thus, the material classes are
themselves classified.

There are many problems with this approach. First, it is very abstract for the average user,
and even the concept may be hard to grasp (if it didn't make sense when you read it, great -
skip to the next paragraph). Second, it plays havoc with concepts of inheritance, as it mixes
the class hierarchy's intended use with transactionally-driven inheritance. When a lower-
level configuration is processed, it attempts to carry values down from the parent
configuration. The class allocation can easily cause unintentional restrictions to be applied at
the same time. Thus, the master data interferes with the transactional data during run-
time. Third, because the classification is not actually carried down, you necessarily give up
polymorphism (an abstract representation of reusability). If your goal is to put the
dependency maintenance in the hands of your users (to at least some degree), it helps to
have simple dependency templates which they can apply. Using class inheritance, this is not
possible, as you must consider inheritance and the specific configuration position in a
complex classification hierarchy.

The final argument against using class hierarchies in this way is that there is a simple
alternative which a user can understand, and which does not introduce new inheritance
concepts. Instead of allocating one class to another in CL24, assign both classes directly to
the material. This works well in multi-level configuration, as the entire classification of the
sub-assembly can also be assigned to the top-level material. Thus, the entire
configuration is knowable at the top level, providing you with maximum solution flexibility.
Also, since the classification is physically assigned to each material, you can create truly
polymorphic dependencies and/or reusable dependency templates.

There are at least two catches to this approach: you will need to force inheritance via
constraint or procedure (as mentioned earlier), and you will lose the restrictive capabilities
of the class hierarchy. But, unless you have a dedicated VC support staff with a very clean,
normalized implementation of class hierarchies, you won't want to use them, anyway.

Scheduling

The first potential problem with a kit is making sure that the sub-item schedules are
aligned so that the kit can ship complete. Easy enough - use a delivery group. A delivery
group forces the items in the kit to schedule together by moving all of the sub-items'
confirmations out to the confirmed date of the sub-item with the longest lead time. This will
cause the proposed production start dates to be staggered accordingly. SAP will not,
however, require the items to actually be built or shipped together - it is a scheduling tool,
not a hard requirement for logistics execution. You can either have SAP automatically
propose a delivery group for the entire kit (via item category customizing), or you can
manually group the items on the sales order.
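The scheduling effect of a delivery group can be sketched like this (illustrative Python; the items and lead times are invented):

```python
# Hypothetical sketch of delivery-group scheduling: every sub-item's
# confirmation moves out to the latest confirmed date, and the proposed
# production start dates are staggered by lead time accordingly.
from datetime import date, timedelta

sub_items = {            # item -> lead time in days
    "BLADE_SERVER": 20,
    "RACK":          5,
    "CABLE_KIT":     2,
}

today = date(2024, 1, 1)
confirmed = {item: today + timedelta(days=lt)
             for item, lt in sub_items.items()}

# The delivery group aligns everything to the longest lead-time item.
ship_date = max(confirmed.values())
production_start = {item: ship_date - timedelta(days=lt)
                    for item, lt in sub_items.items()}

print(ship_date)                      # 2024-01-21
print(production_start["CABLE_KIT"])  # 2024-01-19 - starts last
```

Note that this only aligns the schedule; as stated above, nothing forces the items to actually be built or shipped together.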

For collective order processing, SAP handles all the scheduling for you, so you have no
further special concerns in this regard.

Pricing

In each multi-level model, pricing must be considered early on in your design.

With kits, you may find that you need to be able to invoice at the sub-item level, just in
case you have to short-ship for whatever reason. This will cause problems if your company
is used to billing only the top item. But, if short-ships are an undesirable but unfortunate
reality for the product, you will have little choice but to somehow support pricing at the sub-
item level.

SAP has many ways to support this. For example, you could roll up the pricing that is
presented to the customer in an output document. Or, you could use cumulative pricing
conditions, so that the sub-item prices roll up to the top item as long as the items stay
together.

Of course, you could also choose to adjust your pricing policies. One thing you should NOT
do, however, is to try to use variant pricing at the top level, and some other pricing
mechanism at the sub-item level (should the sub-item ship separately). This will invariably
lead to mismatches between order and invoice pricing, and unpleasant run-ins with your
customer.

On the other hand, if the kit must always ship complete, you can continue to price at the
higher-level item without causing any problems.

For collective order structures and for kits with header-level pricing, you may need to find a
way to carry option prices up from the lower-level configurations, and into the top-level
configuration. As mentioned before, cumulative conditions can be helpful for kits. But, this is
not helpful for collective order structures. In this case, you will have to find a way to carry
the variant keys from the lower-level configurations up to the top configuration, and
maintain prices for the top-level configuration. Provided you can make sure the
configuration is fully known at the top level, and carried down to the lower levels, you may
be able to simply set the variant keys at the top level.

However, if the configurable subitems / subassemblies may be sold individually in some
cases, you may find yourself supporting two price lists for the same options. The simple
solution for this? It depends - if the pricing model is the same for a subassembly whether it
is installed into the top assembly or shipped standalone, you can use pricing reference
materials. Always refer the lower assemblies to the option price list for the top assembly,
and then even when sold standalone, they will pull consistent pricing. But, if the pricing
models are different, you will really have to maintain the two price lists, anyway, so don't
fret.

For advanced mode configurations, you may face an additional challenge. Because you
cannot trigger the variant keys within a configuration made directly on a production order,
you will have to assign them programmatically. I have not yet found a good way around
this.

Profitability

Profitability reporting for a collective order model is clean and fairly straightforward as
assembly costing goes. For kits, though, you will have another issue to address - costs post
at the inventory level, which happens to be at the sub-item level for a kit. Unless you also
price at the sub-item level, your costs and your revenues will not line up. You will need to
resolve this situation.

Assuming you cannot change your pricing model to price at the sub-item level, you will
probably need to create a custom settlement profile and associate it with your sales order
items (via the items' requirements classes), so that their costs always settle to the higher-
level item. You could also use WBS elements, but then you will be introducing project
systems into your solution. If you are already using PS with your configuration, then WBS
will handle the profitability postings quite naturally.

Procurement Considerations

For the sake of discussion, I will use 'procurement' broadly, to include both production and
purchasing. The main consideration here is how far you will perform the sales BOM
explosion. The simple production test is: if you need to pick it, print a label for it, and
identify it in shipping, then it needs to be stocked separately at the time of delivery, and it
needs to be listed on a separate line in the sales order.

Also, if you have a vendor drop-ship a part directly to the customer, you will need a
separate line item in order to trigger a purchase requisition for the item.

Documents Sent to Outside Parties

One last key consideration for multi-level configurations is how you will present them to
your business partners, such as customers. Will you list the configuration options on the
forms? If so, under which parts? What about pricing? Will the pricing print next to the
options? What else might affect the presentation of the configuration to your partners?

These should not be difficult questions to answer, but they should not be put off. If your
documentation does not match your process, you will most certainly have confused and
upset partners. Usually, you should not modify your configuration structure to meet these
requirements, but it does become necessary on rare occasions. Whenever possible, you
should modify output routines and other non-core applications to support the business
structure, and not the other way around.

Planning

In multi-level configurations, you will need to give some additional consideration to your
planning strategies. You may or may not be able to plan all the lower-level assembly
demands by way of the top-level configurable material.

Why is this significant?

If you do not choose to use characteristics planning (or some version of it), you will need to
make sure your sales orders contain enough information to consume planned requirements.
This could result in you having to add dummy / statistical items to your sales orders, which
will lead to additional customizing and training requirements. You will also have to take
particular care that you do not pass consumption requirements at both the top-level and
lower-level configurations. If you do, then you may inadvertently duplicate component
requirements, as the production of the top-level assembly may also generate dependent
demand.

But, if you can plan all of your component demands in terms of option probabilities, you can
align your planning more closely with long-term forecasts that include marketing
information. This will put you in a position to take advantage of some form of
characteristics-based planning, even for multi-level configurations. An added advantage of
this is that because you will be utilizing SIS to track historical usage of particular
characteristic values, you will not need any otherwise meaningless statistical line items on
your sales orders, resulting in cleaner orders and simpler fulfillment processes.

In advanced mode environments, however, you will need to modify your approach to
characteristics planning. Because the entire configuration can never be known in R/3 from
the top level sales order item, you cannot plan simply on characteristic values set at the top
level. Instead, you will need to post usage history based on lower-level configurations, and
ultimately, you will need to plan at these lower levels.

Type Matching / Class Nodes

One more point to consider with multi-level configurations is type matching. Type matching
allows you to automatically substitute a stocked material for a configured one when a
stockable variant of the material can be identified. For commonly stocked assemblies, this is
a must in order to keep inventories, cycle times, and costs down.

In a single-level configuration, this is very straightforward to implement - create the
configurable material, then create the stocked assembly, and then classify the stocked
assembly as a variant of the configurable one.

In a multi-level configuration, though, you will need to consider where the type match is to
take place. If there are multiple levels in a sales order, do you really want the top level
material to have a type match? Or, do you just want the sub items to type match?

Probably most significantly, if the additional configurable levels are in a production order, do
you expect a type match to occur there? If so, you will be in for a mildly rude surprise. SAP
will propose a type match if you ask for one, but will not automatically replace a
configurable material with a stocked one mid-explosion. So, you will either have to do this
manually (or via custom interface), live without any type matching in the production order,
or use class nodes to simulate type matching.

One problem with using class nodes in this way is that nothing (including the class node
itself) will be selected if it does not resolve to a part number. If you cannot create every
possible resultant material in the master data, then you will likely need to create a
configurable material that exists beside the class node in the higher-level BOM - either the
class node is resolved, or the configurable material remains.

Generally, class nodes should be used sparingly in a multi-level configuration. Although it is
possible to resolve an entire class hierarchy from a single-level configuration if need be, you
will not be able to guarantee inheritance, use functions, or use variant tables, because a
class node does not have a configuration profile for you to place these kinds of
dependencies on.

One possible negative result is that you may find yourself leaping levels of a configuration to
carry configuration knowledge down to where it needs to be. It can be made to work, but it
will be confusing, and not maintainable by the average user. Your model will be inefficient,
and carry a high support cost.

Where class nodes add tremendous value is in minimizing the size of the super BOM. This is
particularly true when you do have a non-configurable part number to represent each
possible configuration or classification of a particular configurable assembly. Then, you can
simply assign a single class to the BOM instead of the entire list of part numbers. And,
instead of writing selection conditions for each super BOM component, you simply classify
each part in CL24.

A Simplified Approach

The approach listed below is only one of many possible, but if you do not yet have
experience supporting an organization that is using VC in a live capacity, it is a safe one to
fall back to.

First, do not try to handle the entire multi-level configuration in one fell swoop. Work
each level of the configuration separately, and then blend the levels together. Take each
lower-level item and fully define how it should behave as a standalone configurable
material. Then consider the parent materials minus their sub-items.

That means: construct a super BOM for each parent material, and remove your lower-level
configurations from that BOM. Create classes for each level that contain only the
characteristics needed to fully resolve the configuration for that level. Just be sure to use
characteristics consistently - if you think you will need 'color' at more than one level, use
the same characteristic for 'color'. You can place dependencies on the values if need be.
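For example, a dependency on a value can be as simple as a precondition (the
characteristic and value names here are illustrative):

```
* Precondition assigned to the value COLOR = 'RED', so that
* red is only allowed when the deluxe model is chosen:
MODEL = 'DELUXE'
```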

The point of this first exercise is to make sure each level is fully consistent and fully
normalized. It is very easy to clutter a multi-level configuration by attempting to place
characteristics at different levels all at once, and trying to use constraints unnecessarily to
communicate between the various levels. Variant configuration works best when
approached with an object-oriented mentality. Take it one step at a time, allocating one
configurable object to another, and your total model will end up much cleaner, and will be
done much faster in the long run.

Second, after you have consistent models at each level, you can begin to allocate the
configurable children to the parents. Before you drill into the relationships between the two,
focus on making the logic on the parent BOM consistent. That is, work out your selection
conditions on the configurable components, etc. All you want to do at this point is make
sure the configurable component gets selected under the proper circumstances.
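A selection condition on a configurable component is just a boolean expression evaluated
against the parent's characteristics; a minimal sketch (characteristic and value names are
illustrative):

```
* Selection condition assigned to the configurable BOM
* component, so it is selected only when a side panel
* is actually required:
PANEL_REQUIRED = 'YES'
```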

Next, work on carrying the selection logic downward during configuration. The most efficient
way to do this is to allocate the component's class to the parent material. This will likely
mean you have some redundant characteristics, existing in both the parent material's class
and its child's class, with both classes assigned to the parent material. However, if your
characteristic usage is very consistent (i.e., you used the exact same characteristic at
both levels, as you should have), SAP is capable of merging these during configuration, so
you will not see any duplication. If you can't get the merge to work, adjust your classes,
characteristics, and values until the merge is right.

Now, you can put the pieces together into one cohesive unit. You will need a constraint or
procedure on the configuration profiles of each of the sub-items to copy values downward
(something like $SELF.COLOR = $PARENT.COLOR IF $PARENT.COLOR SPECIFIED). By using
the same characteristics at each level, SAP may inherit these values downward for you, but
it is a bit hard to understand why it works sometimes and not other times, so it's better to
be sure and copy the values yourself.
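Spelled out a little further, the copy-down can be written either way; the class and
characteristic names below are illustrative, and the procedure line is the same pattern
quoted above:

```
* Procedure on the sub-item's configuration profile:
$SELF.COLOR = $PARENT.COLOR IF $PARENT.COLOR SPECIFIED

* The equivalent as a constraint (in a constraint net on the
* top-level profile); ?F and ?P are object variables:
OBJECTS:
  ?F IS_A (300) FRAME_CLASS,
  ?P IS_A (300) PANEL_CLASS.
RESTRICTIONS:
  ?P.COLOR = ?F.COLOR.
INFERENCES:
  ?P.COLOR.
```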

If you made your model well, it will probably work in CU50 now. But, you aren't finished.
Can you build and sell this thing? Do the prices appear at the right point in a sales order?
Do the costs post properly?

These are the questions that you should have raised earlier, and if you know your model is
right, you should be able to work out the answers or solutions quickly. When you are done,
it should feel like you have really modeled the product as simply as possible. If it
doesn't feel right, it probably isn't.
