Semantic Digital Libraries Evaluation: Results of Lukasz Porwol (Jeromedl)


Semantic Digital Libraries Evaluation - Results (Lukasz Porwol/JeromeDL) 18/02/2008 19:33

Semantic Digital Libraries Evaluation


Results of Lukasz Porwol (JeromeDL)

Pre-Evaluation Questionnaire

1. Your age

< 20 21 - 25 26 - 30 31 - 35 36 - 40 > 40

2. Your education

undergrad BSc MSc PhD Other

3. Main domain of education / research

Informatics / Computer Science


Electronics / Telecommunications
Business / Marketing
Fine Arts / Architecture
Life Science / Social Science
Bio-Chemistry
Maths / Physics
Other

4. How often do you visit the university library? (slide the X marks on the lines below)

more / at least once a day / at least once a week / at least once a month / seldom

http://q.digime.name/quest/complete?_u=Songo Page 1 of 21

5. How often do you visit the university digital library? (slide the X marks on the lines below)

more / at least once a day / at least once a week / at least once a month / seldom

6. Do you know what social tagging is?

Yes
No

7. Do you use any online bookmarking system? (If yes - name up to 3 most often used.)

Yes
No

8. Are you subscribed to any online social networking site? (If yes - name up to 3 most often
used.)

Yes
No

9. Have you ever come across the term “Web 2.0”? Do you know what it means?

Yes
No

10. Have you ever come across the term “Semantic Web”? Do you know what it means?

Yes
No

Questionnaire After the Initial Task


1. What is your general first impression about the system? (slide the X marks on the lines
below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

2. How did you find the task of registering to the digital library? (slide the X marks on the
lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

3. How did you find the task of finding publications by students of Stefan Decker? (slide the
X marks on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward


4. How did you find the task of finding "publications about notitio.us"? (slide the X marks on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

5. How did you find the task of bookmarking articles about notitio.us ? (slide the X marks on
the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

6. Please describe how (using which system) you bookmarked articles about notitio.us. (Please provide a link to the bookmarking service if you used an online bookmarking solution outside of the digital library.)

I used notitio.us at first; then, while searching the resources, when I found an article about notitio.us I noticed that there is an embedded bookmarking system that is very useful, and I

7. Do you have any comments to this part of the evaluation?

The interface should be more intuitive and clear, without overloaded information bars on the page you are viewing, and with more graphical icons instead of text.


First Task

Question: What is the (autopresentative) meaning of using information technologies (IT) in education?
(PL: Jakie znaczenie autoprezentacyjne ma korzystanie z technologii IT w dzialaniach
edukacyjnych?)

Answer:

Information technologies enable people to improve the learning process thanks to fast information searching and access. Many kinds of multimedia, such as *.ppt presentations or simply movies presenting essential information, have a great impact on improving education. Thanks to IT we can now provide more customized paths of education. It also provides simple ways to present our achievements.

References:

http://jeromedl.digime.name/resource/dO40R5Ox

Comment:

It was very hard for me to find any information related to this question. Every search returned nothing for the words "autopresentative" and "autopresentation".

Evaluation Questionnaire after the First Task

1. How did you find the task of finding information in the digital library? (slide the X marks
on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

2. How do you find the search feature in the digital library? (slide the X marks on the lines below)

hard to use easy to use


complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

3. How do you find the natural language query in JeromeDL? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

4. How do you find TagsTreeMaps filtering in JeromeDL? (slide the X marks on the lines
below)

hard to use easy to use

complex, mind boggling simple, clearly organized


hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

5. How do you find the Exhibit filtering component in JeromeDL? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

6. How do you find MBB browsing in JeromeDL? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting


ugly, unattractive attractive

useless useful, handy

7. How do you find SQE searching in JeromeDL? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

8. Which features of JeromeDL did you like best? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion

9. Which features of JeromeDL did you hate most? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion

10. Do you have any comments to this part of the evaluation?

Evaluation Questionnaire after the Networking Task

1. How did you find the task of networking with your friends in the digital library? (slide the
X marks on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

2. How do you find the social networking component in JeromeDL? (slide the X marks on
the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting


ugly, unattractive attractive

useless useful, handy

3. Do you have any comments to this part of the evaluation?

Social networking should be more interactive and user friendly, with more (and more intuitive) graphical icons. More social functionalities should be provided (make social networking more social).

Second Task

Question: What (credibility) dimensions are used to evaluate the credibility of the information on the Internet (used in the research)?
(PL: Jakie skale/kryteria uzywane sa do oceny wiarygodnosci informacji Internetowej
(wykorzystywane w badaniach)?)

Answer:

At first, to evaluate the credibility of information on the Internet, the same dimensions as for media credibility were used in the research: believability, accuracy, trustworthiness, bias, and completeness of information. Then five other verification criteria were used: accuracy, authority, objectivity, currency, and coverage. Because perceived credibility may vary depending on the type of information sought and retrieved, the credibility of four types of information was assessed: news or current events information, entertainment information, reference (factual) information, and commercial or product information.

References:

http://jeromedl.digime.name/resource/RKyRXPaE

Comment:

This psychological test was much easier for me than the previous one because I found many articles for the keywords "evaluate information".

Evaluation Questionnaire after the Second Task



1. How did you find the task of finding information in the digital library? (slide the X marks
on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

2. How do you find the search feature in the digital library? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

3. How do you find bookmarks sharing in JeromeDL? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward


boring interesting

ugly, unattractive attractive

useless useful, handy

4. How do you find collaborative browsing in JeromeDL? (slide the X marks on the lines
below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

5. How do you find annotating resources in JeromeDL? (slide the X marks on the lines
below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting


ugly, unattractive attractive

useless useful, handy

6. How do you find ranking resources in JeromeDL? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

7. Do you have any comments to this part of the evaluation?

When the information searching process does not irritate you, browsing through the digital library is a pleasure.

Third Task

Question: Which factors related to psychological cognitive processes influence the growth of the credibility of information from the Internet?
(PL: Jakie czynniki zwiazane z psychologicznymi procesami poznawczymi wplywaja na wzrost
oceny wiarygodnosci informacji internetowej?)

Answer:

To evaluate influence on the credibility of information from the Internet we can use the same factors as for other interpersonal media. We can distinguish several groups of factors: Source, Receiver, Message, Medium and Context. For the Source group we have factors such as: expertise/knowledge, trustworthiness, credentials, attractiveness, similarity to the receiver's beliefs context, likeability/goodwill/dynamism. For the Receiver group we have factors such as: issue relevance, motivation, prior knowledge of the issue, issue involvement, values/beliefs/situation, stereotypes about the source or topic, social location. For the Message group we have factors such as: topic/content, internal validity/consistency, plausibility of arguments, support by data or examples, framing, repetition, familiarity, ordering. For the Medium group we have factors such as: organization, usability, presentation, vividness. And finally for the Context group we have factors such as: distraction/"noise", time since the message was encountered. If we want the credibility of our information to grow, we have to make improvements in the presented areas. To make Internet information credibility grow we can also respond to the Internet user's verification strategy, which we can describe with the questions: Check whether the information is current? Check that the information is complete and comprehensive? Consider whether the views represented are opinions or facts? Seek out other sources to validate the information? Consider the author's goals/objectives for posting the information online? Check who the author of the web site is? Look for an official "stamp of approval" or a recommendation from someone you know? Check whether contact information for the person or organization is provided on the site? Verify the author's qualifications or credentials? If we meet these expectations we will make our credibility grow. In addition, for Internet information to be credible, these criteria should serve as guides for the formation of appropriate Internet policy.

References:

http://jeromedl.digime.name/resource/34S3YdYb
http://jeromedl.digime.name/resource/E37g3Und
http://jeromedl.digime.name/resource/RKyRXPaE

Comment:

This task was pretty easy for me. I found plenty of articles on the searched subject. The first article I checked was very rich (and had enough information), so I really didn't have to search for the answer in any other. The thing that disturbed me was some problems with text input into the textarea in the middle of a sentence (letters are sometimes overwritten).

Evaluation Questionnaire after the Third Task

1. How did you find the task of finding information in the digital library? (slide the X marks
on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward


2. How do you find the search feature in the digital library? (slide the X marks on the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

3. How do you find recommendations in resource view in JeromeDL? (slide the X marks on
the lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy


4. How do you find recommendations in bookmarks in JeromeDL? (slide the X marks on the
lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

5. Do you have any comments to this part of the evaluation?

This stage was pretty nice for me. I found plenty of articles and a lot of information on the searched subject. The interface works fine but should be more user friendly and more beautiful. I had some technical problems with text input into the textarea: whenever I wanted to insert text in the middle of a sentence or text block, the letters were sometimes overwritten.

Post-Evaluation Questionnaire

1. What is your overall impression about the DL library you used? (slide the X marks on the
lines below)

hard to use easy to use

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward


boring interesting

ugly, unattractive attractive

useless useful, handy

2. Which JeromeDL features did you like best? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion
Collaborative Browsing
Bookmarking Resources
Bookmarks Sharing (Collaborative Filtering)
Annotating/Blogging Resources
Ranking Resources
Bookmarks Recommendations
Resource Recommendations

3. Which JeromeDL features did you like least? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion
Collaborative Browsing
Bookmarking Resources
Bookmarks Sharing (Collaborative Filtering)
Annotating/Blogging Resources
Ranking Resources
Bookmarks Recommendations
Resource Recommendations

4. Which JeromeDL features did you find most useful? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion
Collaborative Browsing
Bookmarking Resources
Bookmarks Sharing (Collaborative Filtering)
Annotating/Blogging Resources
Ranking Resources
Bookmarks Recommendations
Resource Recommendations

5. Which JeromeDL features did you find least useful? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion
Collaborative Browsing
Bookmarking Resources
Bookmarks Sharing (Collaborative Filtering)
Annotating/Blogging Resources
Ranking Resources
Bookmarks Recommendations
Resource Recommendations

6. “I think, I would perform better if only ...” (complete the sentence)

... the interface were more user friendly and intuitive, and the search process more effective.

7. What is your general opinion on semantic services in JeromeDL (TTM, MBB, Exhibit, SQE,
Dynamic Collections, NLQ, recommendations) ? (slide the X marks on the lines below)

hard to use easy to use

hard to understand easy to understand

hard to execute easy to execute

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

8. What is your opinion on social services in JeromeDL (bookmarks, blog-comments,


ranking, collaborative browsing) ? (slide the X marks on the lines below)

hard to use easy to use

hard to understand easy to understand


hard to execute easy to execute

complex, mind boggling simple, clearly organized

hard to master, unintuitive intuitive, straightforward

boring interesting

ugly, unattractive attractive

useless useful, handy

9. What should be added to JeromeDL?

Some new, user friendly interfaces should be added, especially a new search interface. Some resources are saved as an image (even though they are still plain text); they should be parsed somehow to enable easy copying of the text instead of rewriting it, which is a waste of time.

10. What could be improved in JeromeDL?

There should be a more integrated, clearer, user friendly and beautiful interface with more graphics. The search process should be much more effective and should enable people to interact with the process.

11. Would you like to continue using this digital library (JeromeDL)? (Say why)

Yes


12. Do you have any comments to this part of the evaluation?

The evaluation is well done, but it should concentrate more on evaluating each JeromeDL module, instead of JeromeDL as a whole, to gain more precise information about them.

[Main Page]

Semantic Digital Libraries Evaluation - Results (Lukasz Porwol/JeromeDL) 02/03/2008 14:37

Semantic Digital Libraries Evaluation


Results of Lukasz Porwol (JeromeDL)

Memory Task

Question: What (credibility) dimensions are used to evaluate the credibility of the information on the Internet (used in the research)?
(PL: Jakie skale/kryteria uzywane sa do oceny wiarygodnosci informacji Internetowej
(wykorzystywane w badaniach)?)
Answer:

The dimensions used to evaluate credibility were above all completeness, believability, accuracy, trustworthiness and bias. This division was made by Flanagin and Metzger in 2000. There were also some other dimensions, such as: authority, objectivity, currency, and coverage.

References:

Comment:

I still remember the main subjects from the text I read some time ago, but how can I remember the exact reference? This stage of the evaluation is not free of cheating. If I wanted to, I could search through the documents I downloaded from JeromeDL (search by date) and would not need access to JeromeDL to find a full answer to this question. Having 45 minutes, I could also create a new JeromeDL account and use it to find information, or use any other Internet source. Of course we believe that the user is honest... but still it can have some influence on the evaluation results.

Evaluation Questionnaire after the Memory Task

1. How did you find the task of recalling previously learned information? (slide the X marks
on the lines below)

hard to understand easy to understand

hard to execute easy to execute

hard to master, unintuitive intuitive, straightforward

http://q.digime.name/quest/complete?ismemory=true&_u=Songo Page 1 of 3

2. Which JeromeDL features did you like best? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion
Collaborative Browsing
Bookmarking Resources
Bookmarks Sharing (Collaborative Filtering)
Annotating/Blogging Resources
Ranking Resources
Bookmarks Recommendations
Resource Recommendations

3. Which JeromeDL features did you like least? (select 3 at most)

Simple Search
Advanced Search
Natural Language Query Search
Filtering with TagsTreeMaps
Filtering with Exhibit
Browsing with MultiBeeBrowse
Semantic Query Expansion
Collaborative Browsing
Bookmarking Resources
Bookmarks Sharing (Collaborative Filtering)
Annotating/Blogging Resources
Ranking Resources
Bookmarks Recommendations
Resource Recommendations


4. “I think, I would perform better if only ...” (complete the sentence)

... I had a better, clearer and more user friendly interface and a more effective search engine.

5. Do you have any comments to this part of the evaluation?

This part was good for checking how much of the information we learn from the Internet (using a digital library) will still be in our heads after some period of time. This part is not free from cheating, so that can have some influence on the overall result.

[Main Page]

