DS Mock2 Updated
The GPS receiver in the watch takes time-stamped signals from at least three
satellites.
The differences between the signals' travel times (measured against the satellites'
atomic clocks) give the distance to each satellite, and these distances are used to
calculate the watch's 2D position (trilateration) and so show the routes of his training runs.
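The distance-to-position step described above can be sketched as a minimal 2D trilateration, using hypothetical anchor coordinates and distances (real GPS works in 3D and also solves for the receiver's clock error):

```python
import math

def trilaterate_2d(p1, p2, p3, d1, d2, d3):
    """Return the (x, y) point whose distances to p1, p2, p3 are d1, d2, d3."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the three circle equations pairwise gives a 2x2 linear system.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the three anchors are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Hypothetical anchors at known positions; the receiver is actually at (3, 4).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist((3.0, 4.0), a) for a in anchors]
print(trilaterate_2d(*anchors, *dists))  # recovers approximately (3.0, 4.0)
```

This is why at least three satellites are needed: with only two distances there are generally two candidate intersection points, and a third measurement disambiguates them.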
Reasons for sharing personal health information:
• The university may have access to data analytics tools that can interrogate his personal health information and give him feedback on his health/fitness. Personalized health analytics feedback
• The university may be able to provide additional health-related information. Suggest additional health improvement patterns
• The university may be able to analyze Jaime's data against other and/or larger data sets. Compare Jaime's health to global standards
• Jaime may accept that his data is "out there" already, so there is no harm in re-sharing it (values).
• Jaime may wish to contribute towards research at the University of Sierra Nevada (values). Jaime's data contributes to serving humanity
Reasons for not sharing personal health information:
• There may be no way of knowing which other third parties the university is sharing Jaime's data with (values, systems). Lack of trust and privacy concerns
• The university may impose conditions that may mean the data is not used for the purposes it was intended (values). Encounters a sense of deception
• Once the data is shared, it is hard to guarantee that it is deleted when it is no longer needed (values).
• Patient privacy is a concern: is the data anonymized, and does the university have sufficient security measures in place? Lack of security
• The sharing of Jaime's data may have unintentional consequences (systems, accountability).
In 10 states in the United States, artificial intelligence (AI) software is used for sentencing
criminals. Once criminals are found guilty, judges need to determine the lengths of their prison
sentences. One factor used by judges is the likelihood of the criminal re-offending*.
The AI software uses machine learning to determine how likely it is that a criminal will re-
offend. This result is presented as a percentage; for example, the criminal has a 90% chance of
re-offending. Research has indicated that AI software is often, but not always, more reliable than
human judges in predicting who is likely to re-offend.
There is general support for identifying people who are unlikely to re-offend, as they do not need
to be sent to prisons that are already overcrowded.
Recently, Eric Loomis was sentenced by the state of Wisconsin using proprietary AI software.
Eric had to answer over 100 questions to provide the AI software with enough information for it
to decide the length of his sentence. When Eric was given a six-year sentence, he appealed and
wanted to see the algorithms that led to this sentence. Eric lost the appeal.
On the other hand, the European Union (EU) has passed a law that allows citizens to challenge
decisions made by algorithms in the criminal justice system.
Answers may include:
• Structural mismatches: varying database structures may result in fields not aligning, leading to unusable integrated data.
_
Award [1] for identifying a problem the developers of the AI system will encounter when
gathering the data that will be input into the AI system and [1] for a development of that reason
up to a maximum of [2]._
5. Outline one problem that may arise if proprietary software rather than
open-source software is used to develop algorithms. [2]
_
Award [1] for identifying a problem of using proprietary software and [1] for a
development of that problem up to a maximum of [2]._
We see and hear news every day and trust that the information provided is accurate. That belief
may soon end.
Artificial intelligence (AI) software is now being developed that can produce fake video footage
of public figures using recordings of their own voices. Using as little as one minute of user-
generated content (data), it can reproduce a particular person’s voice. The developer of this
software demonstrated the results by using the voices of Bill Clinton, George Bush and Barack
Obama in a computer-generated conversation.
Once a person’s voice has been reproduced, a fake video can be created by processing hundreds
of videos of the person’s face. Video footage of politicians is often used, as there is so much
data available online.
Law professor John Silverman commented that, as humans we tend to believe what we see, and
the increased number of tools to make fake media that is unrecognizable from real media is
going to prove a major challenge in the future.
Discuss the claim that companies who develop software that can create fake videos of politicians
should be accountable for the fake videos posted by users of their software on social media
platforms.
Answers may include:
• Although there may be no legal requirement for the software company to monitor the
videos of users of their software, there may be ethical reasons why they should, and it is
not appropriate to hide behind an end-user agreement when the software has the potential
to develop fake videos (ethics, values, transparency).
• The software company may not be seen to have taken all reasonable steps to prevent the
fake videos being posted / the software company can be shown to be acting outside the
spirit of the law or maliciously (ethics, values).
• The software company has allowed users to purchase the software in countries with
rigorous censorship laws, knowing that its use there may be seen as unlawful
(ethics, values).
• If the software company has positioned itself as a responsible developer, its policy
documentation should explicitly show that it is practising what it preaches (values,
ethics) and that it has acted responsibly by minimizing the potential harm that may be caused.
• It is unrealistic and unenforceable, however well intentioned, for the software company to
be accountable for the content of the videos.
• If the end-user agreement stated explicitly that the user would be accountable, would that
clause be enforceable by the software company?
• It is hard to determine at what point the software company would be accountable, as the
software itself does not have the capability to cause harm; it is the user who does so
(values).
• At what point is a video considered fake? Is a spoof video a fake video (media
authenticity)?
• What happened to free speech or freedom of expression (values, ethics)?
In this question it is expected there will be a balance between the terminology related to digital
systems and the terminology related to social and ethical impacts.
Keywords: politics, political speech, lobbying, machine learning, speech recognition, image
analysis, deepfakes, media authenticity, synthetic digital media, monitoring, accountability,
responsibility, change, expression, power, values, ethics
Refer to HL paper 1 Section B markbands when awarding marks. These can be found under the
'Your tests' tab > supplemental materials > Digital society markbands and guidance document.
Q.2
In the near future, it is possible that cash will not be accepted as a means of payment in Sweden.
People are already using alternative ways of paying, such as mobile payment, card payment and
internet payment. Currently, over 95% of citizens in Sweden have internet access.
Many people in Sweden claim there are advantages to using an app developed by Swish. The
Swish app allows friends to share a restaurant bill, pay where credit or debit cards are not
accepted (for example, for babysitting or parking tickets), or make a donation at church.
However, other people in Sweden claim that making the Swish app the only means of payment
may increase inequalities within the country.
Discuss whether countries should pass legislation making apps such as Swish the only means of
payment.
Benefits of apps such as Swish being the only form of payment (claim)
• No need to carry cash (digitalization) / credit cards – no risk of being stolen/no risk of not
having enough cash. Availability of payment with no cash or card
• No risk of credit card being used fraudulently (values). Fraud protection
• Transactions are recorded – there is proof of payment, and the system should increase
transparency / acceptability. direct proof of payment
• Payments are made immediately – no need to wait until person has time to go to bank to
get cash (feasibility). Immediate recorded transactions; no need to visit bank to withdraw.
• Can solve other problems regarding money – bills can be shared, one person pays, and
money is transferred (feasibility). Flexible bill sharing between friends.
• In an emergency, money can be transferred to dependents without them being close (e.g.,
children at university) (feasibility). Emergency money transfer.
• Allows money transactions between individuals. Ease of Peer-to-peer transactions.
• Can limit the amount of money to be transferred, thus preventing individuals spending
more than they have available (acceptability / feasibility). Spending limit control.
• Many people are already used to the app, so it would be a good choice if the country were
going cashless (acceptability). User acceptance
• Easier for users to track budgets / spending, as all transactions are digital (transparency).
Easier Budget tracking
Disadvantages of apps such as Swish being the only form of payment (counter-claim)
• Swedish banks will be able to obtain more data on their users’ transaction habits (privacy
concern, ethics, values). bank surveillance; privacy concern
• Is not available to people who do not have a bank account, so a potential digital divide
concern (equity). Digital divide; not available to people with no bank account (elderly)
• May be problematic for tourists who may not have the app or who cannot link the
purchase to their bank account (systems, equity). Inconvenient for tourists.
• Removes the anonymity of the payer in transactions – the app may store a user’s
transaction history. This would include date, item, recipient of the money and cost of the
item (values, ethics). No anonymity
• The bank controls (power) the maximum amount of money that can be transferred, which
may limit a person’s spending and may not be appropriate in certain situations
(acceptability). Bank-controlled spending limits
• It may not be technically possible to make the transition from a society that uses cash for
transactions to one that does not (change, systems). Different society transition challenges
• If a single app is used, it would give Swish an unfair monopoly over the technology
(power, values, ethics). Monopoly concerns; unfairness.
• Digital divide – smartphone ownership and use by mature adults.
• If a person loses their phone, breaks it, or its battery runs out, they have no way to pay for
anything (systems, equity). Dependency on smartphones and networks.
• Failure in / lack of phone network coverage could affect when and where people could
use the app (systems, equity).
• Failure of the system / technical issues / down time would prevent people from making
transactions (systems, feasibility). Technical issues
Accept implicit and explicit references to apps such as the Swish app.
In this question it is expected there will be a balance between the terminology related to digital
systems and the terminology related to social and ethical impacts.
Keywords: business, families, digital divide, access, inclusion, acceptability, feasibility, equity,
digitalization, anonymity, privacy, change, power, systems, values, ethics
Refer to HL paper 1 Section B markbands when awarding marks. These can be found under the
'Your tests' tab > supplemental materials > Digital society markbands and guidance document.
How technology companies use influencers to promote their products [2]
Answers may include:
• Providing the influencer with the product for free, so that the influencer agrees to post
about the product on social media sites.
• Tech companies use social media data to find influencers who will target their market.
That way, the influencer’s message is focused on their specific market.
_
Award [1] for identifying a way technology companies use influencers to promote their products and [1]
for a development of that reason up to [2] marks._
• It is more cost effective than target advertising, as the company can select the tech
influencer by their follower demographic.
• Influencers are viewed as experts, so followers will trust endorsements and product
mentions.
• Influencer content may be a personal narrative, which helps differentiate posts from the
type of features- or sales-driven ones a brand might do for the same product on their own
feed.
_
Award [1] for one reason why technology companies work with influencers to promote their
products and [1] for the development of that reason up to [2] marks._
_
Award [1] for each relevant trend up to [2] marks. _
Agency: (controllers)
• Source D encourages input from family members for setting rules; likewise, source C
provides data for family discussions and allows anyone to enforce screen-free time. In
both cases, each family member has agency in the solution.
• Source C requires parents to configure the app, and if this is not done correctly the app
may be less effective; likewise, the rules created by families (Source D) may be
incomplete / have gaps / not deal with all eventualities and therefore be less effective.
Monitoring: (observation)
• Source C provides evidence of usage, which could lead the discussions on responsible
use and rule setting, whereas Source D relies on self-reporting for the discussion and
negotiation of rules only with no hard data to determine the effectiveness of the rules.
• Setting/negotiating limits, goals (Source C and D). Outlining rules to be followed in order to reduce negative usage.
• Following through with consequences, responsibility. Intensive follow-up
• Modelling good practice by moderating their usage in front of their children (Source A
and B). A good role model
• Use of technology-based solutions as noted in Source C.
• Setting boundaries and promoting non-technology activities, e.g., screen-free
evenings/holidays, charging overnight in a common space (Source C and D).
• Controlling access through data plans and encouraging children to pay for their own data
after the limit has been reached (responsibility, accountability).
• Watching for warning signs, including spending too much time alone, not getting enough
sleep, worse physical health, and not taking part in healthy activities (mental health).
• Not all parents are aware of safety issues and need to be educated themselves (education).
Governments could play a role with campaigns/laws to raise awareness targeted at
parents (Source A). Special awareness campaigns for parents
• National campaigns targeted at children for alternatives to phone usage/social media
(Source A and B).
• Changes in laws, such as an increase in the age for access to social media sites (e.g.,
COPPA, the Children's Online Privacy Protection Act). Enforce serious laws
• Healthcare funding for research into and support for addiction/mental health.
Providing financial assistance or support
• Accounts may need to have more rigorous age-verification methods before usage.
• Activate time limits settings in apps such as Facebook and Instagram.
• Mobile devices can track/monitor usage such as screen time, apps used.
• Analyse data collected to provide insights that may be used to improve/promote safety
and healthy habits (digital literacy).
Marking notes: It is not necessary to explicitly refer to each source to achieve the highest mark
band, but there must be an explicit reference to at least two sources. To achieve the highest
markband the sources must be synthesized in an integrated manner rather than a systematic
analysis of each individual source.
• Increase in number of friends on social media due to social media being more widely
available.
• Increase in friends in different countries/time zones due to increased coverage worldwide
of social media sites.
• Increase in friends in different countries/cultures due to the use of translation tools for
communication in multiple languages.
• Increase in access to mobile technology due to the decrease in cost of phones/data.
• Increase in use of asynchronous communication methods as asynchronous
communication has become more socially acceptable.
• More people to communicate with due to the increase in number of people in 2018 with
access to mobile technology/social media.
• Increase in number of ways to communicate in addition to in-person, which allows
people who were previously unable to communicate to hold discussions.
_
Award [1] for identifying each reason why there has been a significant decrease in in-person
communication among teens from 2012 to 2018 and [1] for a development of that reason up to [2]
marks. Award a maximum of [4] marks._
Background information not needed.
Autonomous vehicles
Autonomous Vehicle Readiness Index (AVRI): A measure assessing a country's readiness and preparedness for
autonomous vehicles.
Ethical decision-making model (e.g., Markkula): A framework guiding ethical decision-making to align with pre-identified
principles.
Society of Automotive Engineers (SAE) Scale: Classifies levels of vehicle automation features.
Transport network company (TNC): Provides transportation services via online platforms or apps typically offering
ridesharing.
With reference to the proposed interventions, information included in the sources and your
own inquiries, recommend a digital intervention that would most effectively address the
challenge of access to healthcare and medicine in remote and rural communities. [12 marks]
1. Equity - does the intervention address the needs, claims, and interests of those affected
by the challenge?
3. Cost - What are the financial, social, cultural, and environmental costs associated with
the intervention?
4. Feasibility - Is the intervention technically, socially, and politically feasible? What are
some of the barriers?
6. Ethics - Is the intervention ethically sound, and who determines the ethical status of the
intervention?
7. Recommendation -