
BAHIR DAR UNIVERSITY

BAHIR DAR INSTITUTE OF TECHNOLOGY


FACULTY OF COMPUTING
DEPARTMENT OF SOFTWARE ENGINEERING
Internet Programming Individual Assignment
By: Robel Asfaw
ID: 1102401    Submitted to: Mr. Mebiratu
January 20, 2022
BDU, Ethiopia
1. Write in detail the advancement of web programming models based on different metrics
Introduction. The research community began to work in a new area of software engineering in the early
1990s, focusing on the unique properties of the Web environment.

Several approaches, such as HDM (Hypermedia Design Model) and OOHDM (Object-Oriented
Hypermedia Design Method), provided new techniques, models, and notations for "hypermedia"
systems in general and Web-based systems in particular. Over the last decade, several strategies,
approaches, and techniques have been proposed to deal with particular problems of Web development.
Navigation, complex interfaces, difficult maintenance, security concerns, and unknown remote
users are just a few of the key issues that Web-based system programming faces. Summaries and
comparisons of the better-known Web development methodologies, from which a chronological
"map of the land" can be derived, appear in the literature on the practical application of Web
development strategies. Although some of these approaches, such as HDM, are no longer in use,
the basic concepts and principles on which they are founded are still important to the Web
development community. Entity-Relationship modeling was used in a number of early techniques,
including HDM and RMM (Relationship Management Methodology).

The influential publication of OOHDM (Object-Oriented Hypermedia Design Method) marked a
significant shift. This methodology builds on both HDM and the object-oriented paradigm, and it
provides a systematic approach to hypermedia system design and implementation. OOHDM made a
significant contribution to Web engineering research, and many of its ideas have subsequently
gained widespread acceptance. OOHDM advocated breaking hypermedia design down into three models:
a conceptual model, a navigational model, and an abstract interface model, each of which
represents a critical feature of hypermedia systems. This concept of isolating different aspects
of hypermedia systems was groundbreaking at the time, and it is now widely adopted by the Web
engineering research community, allowing a system's complexity to be split into independent
layers. In OOHDM, a modification in the navigational model affects only the navigational model;
the conceptual model does not require any changes. Another key element of OOHDM was the use of
class diagrams to describe not only the conceptual model but also the navigational model.

The navigational model is created as an extension of the basic class diagram. Additional
considerations, such as navigational context, could not be simply or completely expressed with
class diagrams, so OOHDM proposed a supplementary notation for modeling contexts and abstract
interface diagrams. Following OOHDM, other approaches were proposed, each offering new concepts,
models, processes, and methodologies tailored to the needs of interactive hypermedia systems and the
Web environment. Hypermedia systems evolved into full-fledged Web-based information systems over
time, and these methodologies were adapted to meet the new challenge; for example, HDM
(Hypermedia Design Model) evolved into HDM2, which in turn became HDM2000/W2000 and ultimately
led to WebML. This raises the following questions: Why are there so many different approaches?
Is there no standard? Each approach focuses on a different element and
presents models, approaches, and languages that are appropriate for that aspect. For example, WSDM
(Web Site Design Methodology) is primarily concerned with the design of Web sites from a
user-centered standpoint. It presents a distinctive strategy for dealing with various audience
classes and roles, one of the most interesting ideas in this regard. In terms of navigational
and conceptual models, however, OOHDM and EORM (Enhanced Object-Oriented Relationship
Methodology) are quite comparable, despite using different vocabularies and modeling notations.
Some approaches, such as WebRE, have been created more recently to tackle this incompatibility
problem. WebRE is a technique for dealing with Web requirements based on W2000, NDT
(Navigational Development Technique), UWE (UML-based Web Engineering), and OOHDM. Meanwhile,
although the UWA Project and WebML (Web Modeling Language) address requirements definition and
implementation, they place greater emphasis on the analysis and design phases. As can be
observed, the bulk of Web
development approaches are focused on the analysis and design phases, with the remaining phases of
the life cycle receiving notably less attention. The lack of integrated toolsets to support
development methodologies and approaches, a long-standing issue noted some years ago, is one
aspect of Web engineering that remains problematic, given the continuous evolution of Web
systems and the progress and coverage of the most well-known Web development approaches.

Web development strategies must be highly agile in order to deploy fully working upgrades fast and
often. As a result, CASE technologies that provide automated procedures and allow for rapid
development/refactoring are required. Approaches like UWE, which offers the MagicUWE tool, and
WebML, which is supported by the WebRatio tool, have received a lot of attention in recent years.
Nonetheless, a mechanism to assist the translation and consistent integration of semantic metamodels
is required for CASE tools to be interoperable and interchangeable between and across Web
development methodologies. In this aspect, MDWE holds a lot of promise because it has the potential to
allow Web developers to mix and match method fragments from other techniques to create a
customized hybrid tailored to the demands of a particular development project. This research
takes a careful look at this potential, examining whether existing approaches can be easily
integrated or extended with new ones.

Web development strategies can become more interoperable, and lexical differences as well as a
lack of connection between different approaches can be addressed.
• Web developers must mix and match aspects from different approaches, hence the
need for methods that are compatible and interoperable.
• There still remains a lack of tool support for Web development methodologies, and
conversely a lot of development tools lack methodical analysis/design components,
so there is a bilateral disconnection between development tools and development
methodologies, especially between analysis/design and implementation.
All of these issues can be addressed to some extent by adopting a model-driven development paradigm
such as MDWE. This research makes a unique contribution because it primarily examines approaches
based on the model-driven paradigm. Concepts are the most important thing in MDWE, regardless of how
they are represented. MDWE promotes the use of platform-independent metamodels for the
representation of concepts.
The development process is aided by a set of transformations and relationships between models
that facilitate agile development and ensure model consistency. The model-driven approach is
being applied with great success in several fields of software engineering, in both research
and development, which shows that it may be used in Web engineering as well. It is also showing
promise in the field of Web frameworks: certain MVC (Model-View-Controller) frameworks have
become key frameworks for developing Web systems and make it simple to create Web applications.
Struts, Django, and Ruby on Rails are three instances that come to mind. They are open-source
Web application frameworks that use the MVC design and combine simplicity with the ability to
construct Web applications with as few lines of code as feasible and a simple configuration.
In reality, MDE serves as the foundation for these systems.
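The MVC separation described here can be illustrated with a minimal, framework-independent sketch (all class and function names are invented for illustration; real frameworks such as Django or Struts implement the same division much more elaborately):

```python
# Minimal MVC sketch: the model holds data, the view renders it,
# and the controller mediates between user input and the other two.

class TaskModel:
    """Model: owns application state, knows nothing about presentation."""
    def __init__(self):
        self.tasks = []

    def add(self, title):
        self.tasks.append(title)


def task_view(tasks):
    """View: renders model state; contains no business logic."""
    if not tasks:
        return "No tasks."
    return "\n".join(f"- {t}" for t in tasks)


class TaskController:
    """Controller: translates a user action into a model update,
    then asks the view to render the result."""
    def __init__(self, model):
        self.model = model

    def handle_add(self, title):
        self.model.add(title)
        return task_view(self.model.tasks)


model = TaskModel()
controller = TaskController(model)
print(controller.handle_add("write report"))   # - write report
```

Because the three pieces only communicate through narrow interfaces, the view or the storage layer can be swapped without touching the others, which is exactly the property the MVC frameworks above exploit.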
MDE has lately been employed in the testing phase as well. TDD (Test-Driven Development) is a
relatively new research direction that has yielded significant advances, including the use of
transformations to derive test cases and the definition of metamodels to express test aspects.

2.2. Model-Driven Web Engineering (MDWE). MDWE refers to the adoption of the model-driven
paradigm in Web development approaches. It aids the construction of models at a specific step
of the process by combining the knowledge gained in earlier stages with the previously created
models. For example, a development team might be interested in using NDT's requirements
approach to capture business information, as well as UWE's analysis and design phases and WebML code
creation. If a proper set of metamodels and transformations is defined, this is conceivable. Following that,
transformations might be used to obtain UWE analysis and design, and the process could be repeated
using WebML code generation. This hypothetical scenario would allow a developer to benefit from
the advantages offered by each strategy separately, and the lack of full life-cycle coverage
could be addressed by the synergy generated by merging elements of different approaches.
The quality of both the metamodel and the transformations is obviously critical in generating appropriate
outcomes. Defining a common metamodel is difficult, and a high level of abstraction is needed
in order to define concepts and find common notions. Tool support, which is necessary in Web
engineering generally, is equally necessary in MDWE.
Transformations should be carried out automatically, without the development team having to
apply them manually; SmartQVT and Moment are two good examples of tools that support this. It
is worth noting that these tools are methodology-agnostic because they are based on standards
such as UML profiles and the QVT languages, which makes it possible to use common metamodels to
make methods interoperable. As a result, any of these tools will work if a metamodel for a Web
development approach is defined using standards.
The single issue that MDWE cannot directly address is the lack of practical applications.
2.3. Model-Driven Architecture (MDA). The Object Management Group (OMG) defined MDA as the
standard Model-Driven Architecture in 2001. Its goal is to define a standard architecture for the MDE
environment. Four levels are envisaged in MDA:
• CIM (Computation-Independent Model): Concepts that capture the system's logic are defined at
this level, independently of any computing considerations.
• PIM (Platform-Independent Model): At this level, models describe the system without reference
to any specific development platform.
• PSM (Platform-Specific Model): At this level, computer-executable models are defined that
depend on the development platform, such as Java or .NET models.
• Code: This is the most concrete level, comprising the system's implementation.
Some transformations can be defined between these levels in MDA. As a result, CIM-to-PIM, PIM-to-
PSM, or PSM-to-code are all options. MDA also allows you to define conversions on the same level, such
as PIM-to-PIM. The majority of MDWE techniques use the MDA standard to define their metamodels
and transformations, although they generally focus on different levels of the MDA architecture.
3.1. OOHDMDA. OOHDM is one of the most important approaches in Web engineering. It was first
introduced in 1995 and contributed essential concepts such as dividing the design of a Web
system into three models: the conceptual, navigational, and abstract interface models. Several
subsequent approaches adopted this concept.
OOHDM's original scope was limited to design and implementation. However, it was eventually
supplemented with a technique for dealing with requirements known as UID (User Interaction
Diagrams). OOHDMDA is an MDE approach based on OOHDM. Starting from a PIM model
designed with OOHDM, a servlet-based PSM is generated. OOHDMDA provides a Web
application design with a UML-based design tool using the conceptual and the navigational model of
OOHDM. With this base, the approach starts with the XMI file generated
from the tool. Both models are enriched with behavioral semantics that are obtained
from behavioral model classes incorporated in the approach. With this PIM XMI file,
the approach defines some servlet-based transformations in order to obtain a PSM-XMI
file with specific servlet technology. As a result, starting from OOHDM and finishing with
servlet technology, the approach provides various PIM-to-PSM transformations.
Despite the fact that the technique is built on MDE, OOHDMDA does not have its own PIM metamodel.
Of course, MDE simply means that a development strategy employs models and transformations without
requiring the presence of a metamodel; transformations are not always from metamodel to metamodel,
but can also be from model to model. OOHDMDA leverages the OOHDM meta-model as a natural
extension of OOHDM. In a UML-based design tool, OOHDM concepts are defined as stereotypes, and
transformations are generated using Java. OOHDMDA includes two specific metamodels at the PSM
level: a servlet-based PSM for dynamic navigation and another for advanced navigation.
OOHDMDA, on the other hand, focuses on the PSM level, and although there is a departure from
standards in the definition of the PIM metamodel and transformations, the method is practical
and illustrated examples are available in the literature.
Furthermore, the OOHDMDA development approach's utilization of tools provides an appropriate setting
for actual application.
3.2. WebML. WebML is a notation for specifying the conceptual design of complex Web pages,
according to its creators. Its development process begins with the system's conceptual modeling, which is
accomplished through the use of a data model. WebML does not define its own notation at this point,
instead recommending common modeling tools like Entity-Relation diagrams and UML class diagrams.
The definition of a hypertext model is the next step in the procedure. Hypertexts that can be published on
the Web are described in this model. A perspective of the Web site is defined by each "hypertext."
Hypertexts are characterized using two models: the composition model, which defines the system's pages
and "content units," and the navigation model, which describes how these pages are navigated. The
presentation model, which defines the actual appearance of the Web pages, is developed next. Finally, the
customization model emphasizes the importance of tailoring the system to each user's function.
WebML offers a CASE tool called WebRatio, which allows the recommended methodologies to be used
systematically; this is one of the most interesting contributions of WebML. The approach aims
to make the integration of MDE techniques into Web modeling languages as simple as possible.
Its authors offer a semi-automated method for creating MOF-based metamodels from DTD (Document
Type Definition) files. These metamodels are separated into packages based on the initial
definition of the WebML metamodel: Hypertext Organization, Access Control, Hypertext, Content
Management, and Content. Some OCL constraints are also added in order to describe restrictions
on metaclasses and relationships. Some transformations are defined in this approach in order to
derive WebML metaconcepts from DTD concepts; these allow solutions to be reused and address
some of the flaws identified in the original DTD. The transformations are defined informally,
using a matching matrix, as part of a Metamodel Generator (MMG).


3.3. NDT. NDT (Navigational Development Techniques) is a methodological approach within MDWE
that focuses on requirements and analysis. NDT defines a set of CIM and PIM models, as well as
a set of transformations, expressed in QVT, to derive the PIM models from the CIM models.
As in other approaches, these metamodels are defined using class diagrams.
The NDT requirements metamodel is an extension of WebRE that incorporates new concepts. For the
PIM level, NDT also includes two metamodels: a content metamodel and a navigational metamodel.
The first is based on the UML metamodel for class diagrams, while the second is based on the
UWE metamodel. Tool support is one of the most essential features of this methodology. The NDT
MDE development process is supported by a collection of tools called NDT-Suite. Each NDT
metamodel has a unique profile implemented in Enterprise Architect [53]. The NDT methodology
has customized the tool's interface with a set of toolboxes that provide direct access to each
methodology artifact; this environment is called NDT-Profile. NDT-Suite also includes the
following six tools:
1. NDT-Driver: A tool for performing NDT transformations. NDT-Driver is a free Java-based
program that implements NDT's QVT Transformations and automates the generation of analysis
models from requirements models. Despite the fact that NDT transformations are totally defined
in QVT, they are implemented in NDT-Driver with Java, which is ideal for academics working
on industry-based projects.
2. NDT-Quality: A tool for evaluating the quality of a project created using NDT-Profile. It
generates an objective project evaluation and determines whether the methodology and the
MDE paradigm are correctly applied. For this purpose, NDT-Quality includes a test-rule file
that verifies the use of QVT transformations in an NDT project.
3. NDT-Report: A tool for creating official documents that are reviewed and validated by end
users and clients. For example, it can generate a Requirements Document automatically based
on the format specified by clients.
4. NDT-Prototypes: A tool that creates functional prototypes based on NDT specifications.
Thanks to this high degree of tool support, with transformations executed automatically and
assistance offered for all phases of the development life cycle, the NDT technique has been
applied in practice on multiple real projects.
5. NDT-Glossary: An automated tool that generates the first instance of a project's glossary
of terminology from the NDT-Profile tool.
6. NDT-Checker: The only tool in the NDT-Suite that is not based on the MDE paradigm. This
tool comes with a collection of sheets specific to each NDT product. In requirements
reviews, these sheets provide a set of checklists that should be evaluated manually with
users.

3.6. OOWS. OOWS is a Web methodology that focuses mostly on the analysis stage. It is a Web
extension of an older methodology called OO-Method, which is based on the object-oriented
paradigm and consists of three models: a structural model, a dynamic model, and a functional
model. OOWS adds two more models that are specific to Web development: a navigational model
and a presentation model. According to a recent article, OOWS is built on model-driven
development, with a method for converting a Web model into a series of prototypes. First,
requirements are defined using task metaphors; these tasks are then converted into a graph
with the AGG tool, and analysis models are then created using graph transformations.

Standards and compatibility. One of the most important advantages of the


MDWE paradigm is the possibility of making various approaches compatible. MDWE
is focused on concepts and the way to deal with and represent these concepts is unimportant. However,
if a metamodel or a concept is defined freely without reference to a
common standard, the multiplicity of concepts can surface again as a problem, just as it
originally did in the Web engineering approaches of the 1990’s. If a metamodel or some
transformations were defined using a common language, the connection among approaches
could be easily facilitated.
To this end, the use of UML profiles offers very interesting results. A UML profile
is an extension mechanism offered by UML to extend the basic concepts of an MDWE
approach. Thus, if an approach defines its own metamodel using a class diagram and
later defines a UML profile, then it offers a standard definition of its concepts that can be
understood by other researchers and groups. As examples of UML profiles, NDT provides
the concept of Storage requirements which is an extension of the UML class, while UWE
defines the Content concept, which is also an extension of the UML class. If both are
analyzed in each approach, we can conclude that they represent the same idea, although
they are named differently in each methodology. Extensions that are based on the same UML
concept give rise to opportunities for forward compatibility, thereby representing an important
step towards a common metamodel for Web modeling.

2. What do you think of static and dynamic web pages?


• In contrast to dynamic web pages, which are generated by a web application, a static web page (also
known as a flat page or a stationary page) is a web page that is given to the user's web browser exactly
as saved. As a result, a static web page displays the same information to all users in all contexts, subject
to current web server capabilities to negotiate the document's content-type or language when such
versions are available and the server is configured to do so.

• A server-side dynamic web page is one that is built using server-side scripting and is
controlled by an application server. In server-side scripting, parameters determine how each
new web page is assembled, including how any further client-side processing is set up.

A client-side dynamic web page uses JavaScript to process the page as it loads in the browser. To query
and modify the page's state, JavaScript can use the Document Object Model, or DOM. Even if a web
page is dynamic on the client side, it can still be hosted on a static hosting service like GitHub Pages or
Amazon S3 if it doesn't contain any server-side code.
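The contrast between static and (server-side) dynamic pages can be sketched with two hypothetical page-serving functions (the markup and function names are invented for illustration):

```python
import datetime

# A "static" page: the same stored content is returned for every request,
# exactly as saved on disk.
STATIC_PAGE = "<html><body><h1>About us</h1></body></html>"

def serve_static():
    return STATIC_PAGE

# A "dynamic" page: assembled at request time from parameters and server
# state, so different users (or request times) can see different content.
def serve_dynamic(username):
    now = datetime.datetime.now().strftime("%H:%M")
    return f"<html><body><h1>Hello, {username}! It is {now}.</h1></body></html>"

# The static page is byte-for-byte identical on every call; the dynamic
# page varies with its inputs.
assert serve_static() == serve_static()
print(serve_dynamic("Alice"))
```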

To alter some variable content, a person or a computer program must reload a dynamic web page.
The updated information could come from the server or from modifications made to that page's
DOM. This may or may not truncate the browsing history or create a saved version to return to,
but a dynamic web page update using AJAX technologies neither creates nor truncates the web
browsing history in advance of the displayed page. The end user receives a single dynamic page
in the web browser, managed as one page, although the actual web content shown on that page can
vary. The AJAX engine runs in the browser and is responsible only for requesting updates to
parts of the page's DOM.

DHTML is an umbrella term for the technologies and methods used to produce non-static web
pages, although the name has fallen out of favor since the popularization of AJAX and is now
rarely used. The dynamic web experience in a browser is created via client-side scripting,
server-side scripting, or a combination of the two.
3. Write in detail how the web works and HTTP response codes.

 Computers connected to the web are called clients and servers. A simplified diagram of
how they interact might look like this:

[Diagram: two circles labelled client and server; an arrow labelled "request" goes from the
client to the server, and an arrow labelled "response" goes from the server to the client.]

• Clients are a typical web user's internet-connected devices (for example, your computer
connected to your Wi-Fi, or your phone connected to your mobile network) and the web-accessing
software available on those devices (usually a web browser like Firefox or Chrome).
• Servers are computers that store webpages, sites, and apps. When a client device requests a
webpage, the server sends a copy of the webpage to the client machine, which is then displayed
in the user's web browser.
The client and server described above are only part of the story; there are many other
components involved. Let's use a real-world analogy for a closer look: imagine that the web is
a road. On one end of the road is the client, which is like your house. On the other end of the
road is the server, which is like a shop you want to buy something from. In addition to the
client and the server, we also need:

• Internet connection: This enables you to send and receive data over the internet. It's essentially the
road between your house and the store.

• TCP/IP: Transmission Control Protocol and Internet Protocol are communication protocols that
define how data should travel across the internet. This is like the transport mechanisms that
let you place an order, go to the shop, and buy your goods; in our analogy, this is like a car
or a bike (or however else you might get around).

• DNS: Domain Name Servers (DNS) serve as a website's address book. When you type a web address in
your browser, the browser looks at the DNS to find the website’s IP address before it can retrieve the
website.

• HTTP: Hypertext Transfer Protocol is an application protocol that establishes a communication


language between clients and servers. This is similar to the phrase you use to place an order.

• Component files: A website is made up of many separate files, which are like the various
goods you purchase from the shop. There are two main sorts of these files:
• Code files: Websites are built primarily from HTML, CSS, and JavaScript.
• Assets: A collective name for all the other material that makes up a website, such as images,
music, video, and documents.

When you type a web address into your browser, the following happens:

1. The browser connects to the DNS server and obtains the true address of the website's server (you find
the address of the shop).

2. The browser sends an HTTP request message to the server, requesting that it deliver the client a copy
of the webpage (you go to the shop and order your goods). TCP/IP is used to send this message, as well
as any other data transmitted between the client and the server, through your internet connection.

3. If the server accepts the client's request, it responds with a "200 OK" message, which
means "Of course you can look at that website! Here it is," and then starts sending the
website's files to the browser as a series of small chunks called data packets (the shop gives
you your goods, and you bring them back to your house).

4. The browser assembles the little parts into a whole web page and displays it to you (the products
come — new flashy stuff, wonderful!).
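The request/response exchange in steps 2-4 can be reproduced end to end with Python's standard library. In this sketch a tiny local server stands in for the remote web server, and the DNS step is skipped by addressing 127.0.0.1 directly:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# A tiny local server standing in for the remote web server.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Hello from the server!</body></html>"
        self.send_response(200)                     # step 3: "200 OK"
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                      # page sent back to client

    def log_message(self, fmt, *args):              # silence console logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)      # port 0: pick a free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: send an HTTP GET request (step 2) and read the reply.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/")
response = conn.getresponse()
print(response.status, response.reason)             # 200 OK
page = response.read()                              # step 4: assemble the body
conn.close()
server.shutdown()
```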

Component files are parsed in the following order:

When browsers request HTML files from servers, the HTML files frequently contain <link>
elements referencing external CSS stylesheets and <script> elements referencing external
JavaScript scripts. It's important to know the order in which the browser parses those files
as it loads the page:
• The browser parses the HTML file first, which leads it to recognize any <link>-element
references to external CSS stylesheets and any <script>-element references to scripts.

• As the browser parses the HTML, it sends requests back to the server for any CSS files it
finds in <link> elements and any JavaScript files it finds in <script> elements, and then
parses the CSS and JavaScript from those responses.

• From the parsed HTML, the browser creates an in-memory DOM tree, an in-memory CSSOM structure,
and compiles and executes the parsed JavaScript.

• A visual representation of the page is painted to the screen as the browser builds the DOM
tree, applies the styles from the CSSOM tree, and executes the JavaScript; the user then sees
the page content and can begin to interact with it.
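The first step — discovering <link> and <script> references while parsing the HTML — can be sketched with Python's standard html.parser module (the class name and sample markup are invented for illustration):

```python
from html.parser import HTMLParser

# As the browser parses HTML it discovers <link> and <script> references
# and requests those files; this sketch performs only the discovery step.
class AssetFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stylesheets = []
        self.scripts = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "stylesheet":
            self.stylesheets.append(attrs.get("href"))
        elif tag == "script" and "src" in attrs:
            self.scripts.append(attrs["src"])

html = """
<html><head>
  <link rel="stylesheet" href="style.css">
  <script src="app.js"></script>
</head><body><p>Hi</p></body></html>
"""

finder = AssetFinder()
finder.feed(html)
print(finder.stylesheets)  # ['style.css']
print(finder.scripts)      # ['app.js']
```

A real browser would now issue one request per discovered file, then build the CSSOM from the stylesheets and execute the scripts against the DOM.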

DNS explained

The friendly, memorable addresses you type into your address bar to find your favorite websites
aren't the real addresses the network uses. The real address is a series of numbers such as
63.245.215.20, called an IP address, which identifies a unique location on the internet. That's
not easy to remember, though, is it? Domain Name Servers were created to solve this problem.
These are special servers that match the web address you type into your browser (such as
"mozilla.org") to the website's real (IP) address.
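The lookup a browser triggers can be performed through the operating system's resolver. This sketch queries "localhost", which conventionally resolves to the loopback address without any network access (a real domain name would go out to DNS servers):

```python
import socket

# Resolve a hostname to an IPv4 address, the same lookup a browser
# performs before it can contact a web server. "localhost" maps to the
# loopback address, so this works offline.
ip = socket.gethostbyname("localhost")
print(ip)  # typically 127.0.0.1

# For a public site the call is identical, e.g.:
#   socket.gethostbyname("mozilla.org")
# but that requires a working internet connection and DNS server.
```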

Packets explained

Data is transmitted across the internet in thousands of small chunks called packets. Data is
sent in small packets for a variety of reasons: packets sometimes get dropped or corrupted, and
it's easier to resend small pieces when this happens. Additionally, packets can be routed along
different paths, making the exchange faster and allowing many users to download the same
website at once. If each website were sent as a single large chunk, only one user could
download it at a time, which would make the web very inefficient and not much fun to use.
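Packetization and reassembly can be illustrated with a toy sketch (real TCP segments carry headers, checksums, and acknowledgements that this deliberately omits):

```python
import random

# Toy packetization: a payload is split into fixed-size chunks tagged
# with sequence numbers, which may arrive out of order and are put back
# together by sorting on those numbers.

def to_packets(data, size):
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    return b"".join(chunk for _, chunk in sorted(packets))

page = b"<html><body>Hello, web!</body></html>"
packets = to_packets(page, 8)
random.shuffle(packets)           # packets may take different routes
assert reassemble(packets) == page
print(len(packets), "packets")    # 5 packets for this 37-byte payload
```

Because each chunk is independent, losing one means resending only 8 bytes rather than the whole page, which is exactly the robustness argument made above.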

HTTP response codes

A server sends an HTTP response to a client. The goal of the response is to provide the client
with the resource it requested, to inform the client that the action it requested was carried
out, or to inform the client that an error occurred while processing its request.

HTTP response status codes show whether or not a particular HTTP request was completed successfully.
The responses are divided into five categories:

1. Informational responses (100–199)

It means the request has been received and the process is continuing.

2. Successful responses (200–299)

It means the action was successfully received, understood, and accepted.

3. Redirection messages (300–399)

It means further action must be taken in order to complete the request.


4. Client error responses (400–499)

It means the request contains incorrect syntax or cannot be fulfilled.

5. Server error responses (500–599)

It means the server failed to fulfill an apparently valid request.
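The five ranges map naturally to a small classification function; a minimal sketch:

```python
# Maps an HTTP status code to its category, following the five
# standard ranges listed above.

def status_category(code):
    if 100 <= code <= 199:
        return "Informational"
    if 200 <= code <= 299:
        return "Successful"
    if 300 <= code <= 399:
        return "Redirection"
    if 400 <= code <= 499:
        return "Client error"
    if 500 <= code <= 599:
        return "Server error"
    return "Unknown"

print(status_category(200))  # Successful
print(status_category(404))  # Client error
print(status_category(503))  # Server error
```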

4. How can you identify and categorize poor and good layouts/interfaces in a web-based system?

Designing an interface: characteristics of a good interface

Simplicity: The design of the user interface should be straightforward, so that tasks can be
completed with as few mouse clicks and keystrokes as possible. New features should only be
added if there is a compelling need for them and they offer significant value to the
application.

Uniformity: The user interface should be consistent. By maintaining consistency, web designers
can avoid information chaos, uncertainty, and instability. We should apply a consistent
typeface, style, and size convention to all screen components to increase screen readability
and aid learning, and we can use permanent objects as unchanging reference points that help the
user navigate.

Intuitiveness: This is the most important characteristic of a good user interface design. An
intuitive user interface is one that is simple to understand and operate. Icons and labels must
be clear and straightforward: a clear, unambiguous icon helps make the interface intuitive, and
making labels conform to the vocabulary of the program's users is good practice. A good user
interface design should also prevent users from undertaking inappropriate tasks by deactivating
or "graying out" particular elements under certain conditions.

Forgiveness: This feature encourages users to make full use of the software. When users find
themselves somewhere they shouldn't be, designers should provide a clear route out.

Graphical User Interface Design: A graphical user interface creates an operational environment
for the user through screen displays that give an explicit visual and functional context for
the user's actions. It includes standard objects such as buttons, icons, text fields, windows,
images, and pull-down and pop-up menus.

Characteristics of a bad interface design

Hard-to-find navigation options. When browsing a website or using an app, you may spend a lot
of time looking for a navigation option, only to finally find it in an insignificant place.
This is an example of bad interaction design: an inconspicuous option wastes a great deal of
time and energy, and eventually users lose patience. Navigation options should be laid out
clearly, because they reflect the basic functions of your product.
Complicated operations and bad usability. If your design is too complicated for first-time
visitors to your website or app to understand, they will quickly come to dislike it. Any
product should put usability first, because users choose a product in order to achieve their
goals and meet their needs easily. Interaction designers must therefore consider the inherent
usability of the interface so that the underlying system is easy to comprehend and use.
Inappropriate connections. Producing an easy-to-operate interaction design means you are
already halfway to success; in practice, however, some interaction designers overlook the
details. Here is an example of bad interaction design: a user wants to open the feedback page,
but after clicking the option, the company's address appears instead. As the old saying goes,
details determine success or failure, and such mistakes signal unreliability to users.
Badly arranged components and poor typography. The typography of pages and components cannot be
ignored. Sometimes we need two popups open at once to view different pieces of information; if
the designer has not arranged them properly, one covers the other, or clicking one popup makes
the other disappear, so users have to close one popup to open another. Repeating this action
makes users uncomfortable. A good interaction design simplifies users' operations, not the
opposite.
Illogical interaction ideas. If the interaction logic does not match users' habits of thought,
the result will be a terrible interaction design. Sound interaction thinking should guarantee a
fluid design process, and a fluid design process should be supported by an excellent
prototyping tool, with advantages such as abundant components, easy operation, and good
ergonomics, helping designers maintain good design thinking and produce great interaction
designs.
