Lesson 1 – ICT lesson plan – Fast, repetitive processing
  1. How does ICT achieve fast repetitive processing?
  2. What real world applications are there?
  3. What are the limitations of fast repetitive processing?

http://www.igcseict.info/theory/7_2/manuf/

 

http://www.techrepublic.com/article/chinese-factory-replaces-90-of-humans-with-robots-production-soars/

Science

Supercomputers are used in science to provide fast, repetitive processing for tasks such as molecular modelling and tracking climate change. The Met Office has a £97 million supercomputer with impressive computing power that produces high-resolution models able to pinpoint more detail for small-scale, high-impact weather. These high-resolution models can help predict, for example, when fog will sit over airports. The Met Office's supercomputer produces daily weather forecasts and severe weather warnings, so it needs to be very powerful to undertake these tasks.
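The idea of "fast, repetitive processing" can be shown with a toy sketch (this is illustrative only, not the Met Office's actual model): a weather model repeatedly applies the same simple update to every cell of a grid, billions of times over a 3-D grid, which is why a supercomputer is needed.

```python
import time

# Toy illustration of fast, repetitive processing: repeatedly average each
# cell of a 1-D "temperature" grid with its neighbours. The computer performs
# the same simple calculation over and over, far faster than a person could.
def smooth(grid, steps):
    for _ in range(steps):  # the same update, repeated many times
        grid = [
            (grid[max(i - 1, 0)] + grid[i] + grid[min(i + 1, len(grid) - 1)]) / 3
            for i in range(len(grid))
        ]
    return grid

start = time.perf_counter()
# 101 cells, 1,000 passes = 101,000 identical cell updates
result = smooth([0.0] * 50 + [100.0] + [0.0] * 50, steps=1000)
elapsed = time.perf_counter() - start
print(f"{1000 * 101:,} cell updates in {elapsed:.3f}s")
```

Even an ordinary laptop finishes this in a fraction of a second; a real forecast model does the same kind of work on billions of cells per step.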

 

Supercomputers play a huge part in computational science and molecular modelling. They are used regularly to produce accurate models of biological macromolecules, polymers and crystals, and to find the properties of compounds through simulated experiments. Oak Ridge National Laboratory in the US has built a supercomputer called 'Titan', which in 2012 was declared the fastest supercomputer on Earth. Titan was built primarily for science, but it has other uses too. Its peak performance is around 20 petaflops; it has just under 300,000 CPU cores and over 700 terabytes of RAM. Titan works on quantum mechanics, complex chemistry, climate-change predictions, astrophysics and more on a daily basis. At the moment Titan is working on a project investigating biofuel as an alternative to fossil fuels.
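The repetitive nature of molecular modelling can be sketched in a few lines (a minimal illustration, not Titan's actual code): each simulation step recomputes the distance between every pair of atoms, so for N atoms that is N*(N-1)/2 identical calculations per step, repeated for thousands of steps.

```python
import math
import random

# Minimal sketch of the repetitive work in molecular modelling: every
# simulation step recomputes all pairwise atom distances. For 100 atoms
# that is 4,950 identical distance calculations per step.
random.seed(1)
atoms = [(random.random(), random.random(), random.random()) for _ in range(100)]

def pair_distances(atoms):
    # One calculation per unordered pair of atoms: N*(N-1)/2 in total.
    return [
        math.dist(atoms[i], atoms[j])
        for i in range(len(atoms))
        for j in range(i + 1, len(atoms))
    ]

steps = 50
for _ in range(steps):           # 50 simulated time steps
    d = pair_distances(atoms)    # 4,950 distances per step
print(f"{steps * len(d):,} distance calculations")  # prints "247,500 distance calculations"
```

A real molecular-dynamics run uses millions of atoms and millions of time steps, which is why this class of problem needs a machine like Titan.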

http://www.metoffice.gov.uk/news/releases/archive/2014/new-hpc

https://en.wikipedia.org/wiki/Folding@home

http://www.itworld.com/article/2889072/u-s-army-plans-for-a-100-petaflop-supercomputer.html

Security

PRISM is a clandestine surveillance program (one carried out so that it goes unnoticed by the general population) under which the NSA collects internet communications from at least nine major US internet companies to be used for global surveillance. The companies involved include Microsoft, Google, Facebook and Apple. In 2013, US President Barack Obama described the program as "a circumscribed, narrow system directed at us being able to protect our people."

Media Disclosure

In 2013, evidence of the program was leaked to the newspapers The Guardian and The Washington Post by an NSA contractor, Edward Snowden. Snowden warned the public that the extent of mass data collection was far greater than they knew and included what he characterized as "dangerous" and "criminal" activities. The leaked documents included 41 PowerPoint slides, 4 of which were published in news articles. The slide presentation stated that much of the world's electronic communications pass through the US, because electronic communications data tend to follow the least expensive route rather than the most physically direct one. This means that the NSA could, in principle, look at most of the global population's data; even if it does not examine most of it, it has the capability to do so. Snowden's documents also included evidence that other government agencies, such as the UK's GCHQ, undertook similar mass interception and tracking of internet and communications data.

Some of the key targets for the PRISM program were:

  • Venezuela
    – Military procurement
    – Oil
  • Mexico
    – Narcotics
    – Energy
    – Internal security
    – Political affairs
  • Colombia
    – Trafficking
    – FARC (Revolutionary Armed Forces of Colombia)

https://en.wikipedia.org/wiki/PRISM_(surveillance_program)

http://qz.com/626697/humans-are-replacing-robots-on-mercedes-production-line/

Lesson ideas:

  • Flash-card sorting of topics into different areas: fit each card into the correct box.
  • True/False statements for discussion.