Recent research in computer science and technology

College of Engineering announces big data minor: you can learn essential technology skills with the new program, launched this fall.

Nuclear robotics research featured in Inside Unmanned Systems: research by Kostas Alexis into how drones could be used to explore nuclear waste sites is changing the way authorities approach nuclear decontamination.

Exploring how to make virtual reality more accessible to women: Eelke Folmer is part of a team studying VR sickness, which seems to affect women more than men.

Charles Babbage is sometimes referred to as the "father of computing". Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity, aiding in computations such as multiplication and division.

Further, algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment.

In 1822, Charles Babbage began designing the first automatic mechanical calculator, his Difference Engine, which eventually gave him the idea for the first programmable mechanical calculator, his Analytical Engine.
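The Difference Engine's underlying algorithm, the method of finite differences, tabulates polynomial values using nothing but repeated addition. A minimal sketch in Python (an illustration of the method, not of Babbage's mechanism):

```python
def difference_engine(initial_differences, steps):
    """Tabulate a polynomial with repeated addition only, the way
    Babbage's Difference Engine did: each register holds a finite
    difference, and every step folds each difference into the one
    above it."""
    diffs = list(initial_differences)  # [f(0), first diff, second diff, ...]
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]  # addition is the only operation used
    return values

# f(x) = x^2 has initial differences [0, 1, 2]:
# f(0) = 0, f(1) - f(0) = 1, and a constant second difference of 2.
print(difference_engine([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```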

When Howard Aiken's Harvard Mark I, a machine inspired by the Analytical Engine, was finished in 1944, some hailed it as "Babbage's dream come true". Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s. The first computer science degree program in the United States was formed at Purdue University in 1962. Although many initially believed it was impossible that computers themselves could be a scientific field of study, in the late fifties the idea gradually became accepted among the greater academic population.

Initially, computers were quite costly, and some degree of human assistance was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

Contributions

The German military used the Enigma machine during World War II for communications it wanted kept secret.

Computer science enabled the start of the "Digital Revolution", which includes the current Information Age and the Internet. It also enabled advanced study of the mind, and the mapping of the human genome became possible with the Human Genome Project.

Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.
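As a toy illustration of the statistical flavor of such systems, the sketch below computes a simple moving-average crossover signal; the price series and window sizes are invented for demonstration and bear no relation to any real strategy:

```python
# A toy moving-average crossover: one of the simplest statistical
# rules used to illustrate algorithmic trading. The price series
# below is made up purely for demonstration.
prices = [100, 101, 103, 102, 105, 107, 106, 108, 110, 109, 111, 114]

def moving_average(series, window):
    return sum(series[-window:]) / window

for t in range(5, len(prices) + 1):
    fast = moving_average(prices[:t], 3)  # short-term trend
    slow = moving_average(prices[:t], 5)  # longer-term trend
    signal = "buy" if fast > slow else "sell"
    print(f"day {t - 1}: fast={fast:.2f} slow={slow:.2f} -> {signal}")
```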

Even films that feature no explicit CGI are now usually "filmed" on digital cameras, or edited and post-processed using digital video editors. Modern computers enable the optimization of designs as complex as complete aircraft. Notable in electrical and electronic circuit design is SPICE, along with software for the physical realization of new or modified designs; the latter includes essential design software for integrated circuits.
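At its core, a circuit simulator like SPICE numerically integrates the differential equations describing a circuit. The sketch below applies crude forward-Euler integration to an RC low-pass filter's step response; it is a conceptual illustration with arbitrary component values, not SPICE's actual algorithm:

```python
# A crude illustration of the numerical core of circuit simulators
# such as SPICE: integrating an RC low-pass filter's step response.
R = 1_000.0   # resistance: 1 kOhm
C = 1e-6      # capacitance: 1 uF (time constant R*C = 1 ms)
dt = 1e-5     # integration step: 10 us
v_in, v_out = 5.0, 0.0  # 5 V step input, capacitor starts discharged

for step in range(1, 301):
    # The capacitor voltage obeys dV/dt = (V_in - V_out) / (R * C).
    v_out += dt * (v_in - v_out) / (R * C)
    if step % 100 == 0:
        print(f"t = {step * dt * 1e3:.1f} ms, V_out = {v_out:.3f} V")
```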

There are many applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in video games and on the modern battlefield in drones, anti-missile systems, and squad support robots.

Human-computer interaction combines novel algorithms with design strategies that enable rapid human performance, low error rates, ease of learning, and high satisfaction.

Researchers use ethnographic observation and automated data collection to understand user needs, then conduct usability tests to refine designs. Key innovations include direct manipulation interfaces, selectable web links, touchscreen designs, mobile applications, and virtual reality. One classic quantitative model from this field, Fitts's law, is sketched below.
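Fitts's law predicts the time needed to point at a target from the target's distance and width, which is why large, nearby controls feel fast. A minimal sketch, with illustrative (not empirically fitted) constants:

```python
import math

# Fitts's law, a classic HCI model: predicted pointing time grows
# with target distance D and shrinks with target width W.
# The constants a and b here are illustrative, not measured values.
def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds (Shannon formulation)."""
    return a + b * math.log2(distance / width + 1)

# A large nearby button versus a small distant link (in pixels):
print(f"{fitts_time(100, 50):.2f} s")  # easy target -> ~0.34 s
print(f"{fitts_time(800, 10):.2f} s")  # hard target -> ~1.05 s
```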

Because much of computer science does not involve the study of computers themselves, several alternative names for the field have been proposed.

Employment of computer and information technology occupations is projected to grow 13 percent from 2016 to 2026, faster than the average for all occupations.

Free resources, funding, training, and research to help educators foster the next generation of problem solvers and computer scientists. IACSIT is a registered international scientific association of distinguished scholars engaged in Computer Science and Information Technology.

IACSIT members include research and development center heads, faculty deans, department heads, professors, research scientists, engineers, scholars, experienced software development directors, managers and engineers, and university postgraduate and undergraduate students.

Professor Tim Hickey's research focuses on educational technology, brain-computer interfaces, and game-based learning. Recent Microsoft Research podcast episodes cover the Deep Learning Indaba's work strengthening African machine learning, the process of writing efficient computer vision algorithms, uniting fundamental research with industry-defining products, and optimizing imperative functions in relational databases with Froid.

Sep 13: Introduction to Computer Science using Python from MITx has become the most popular MOOC in MIT history.

August 30: A New York Times op-ed by MIT's president argues that a national focus on innovation and research is more effective than only playing defense on trade practices.
