1.“The article discusses how the security vulnerabilities known as Meltdown and Spectre could affect personal computers. Information is provided on how kernel memory is at risk, potentially allowing hackers to access passwords, cryptographic keys, or emails. The article goes on to detail the security updates and patches users can download to protect their computers. Meltdown and Spectre exploit critical vulnerabilities in modern processors. These hardware vulnerabilities allow programs to steal data that is currently being processed on the computer. While programs are typically not permitted to read data from other programs, a malicious program can exploit Meltdown and Spectre to get hold of secrets stored in the memory of other running programs. This might include your passwords stored in a password manager or browser, your personal photos, emails, instant messages, and even business-critical documents. Meltdown and Spectre work on personal computers, mobile devices, and in the cloud. Depending on the cloud provider's infrastructure, it might be possible to steal data from other customers. In addition, Meltdown breaks the most fundamental isolation between user applications and the operating system. This attack allows a program to access the memory, and thus also the secrets, of other programs and the operating system.”
2.A Graphical User Interface (GUI) allows users to interact with computer hardware in a user-friendly way. These GUIs have been developed over the years for a range of operating systems, such as OS/2, Windows, Amiga, Linux, Symbian OS, Macintosh, and more.
Consider the evolution of the interface designs of the major operating systems since the 1980s.
XEROX ALTO: The first personal computer to use a modern graphical user interface was the Xerox Alto, developed in 1973. It was not a commercial product and was intended mainly for research at universities.
XEROX 8010 STAR: Released in 1981, this was the first commercial system marketed as a fully integrated desktop computer, including applications and a GUI.
It was known as the Xerox Star, later renamed ViewPoint, and later again renamed GlobalView.
“The Alto became well known in Silicon Valley and its GUI was increasingly seen as the future of computing. In 1979, Steve Jobs arranged a visit to Xerox PARC, in which Apple Computer personnel would receive a demonstration of the technology from Xerox in exchange for Xerox being able to purchase stock options in Apple. After two visits to see the Alto, Apple engineers used the concepts to introduce the Apple Lisa and Macintosh systems.
Xerox eventually commercialized a heavily modified version of the Alto concepts as the Xerox Star, first introduced in 1981. A complete office system including several workstations, storage, and a laser printer cost as much as $100,000, and, like the Alto, the Star had little direct impact on the market.”
3.“Under intense environmental pressure, the global energy sector is promoting the integration of renewable energy into interconnected energy systems. The demand-side management (DSM) of energy systems has drawn considerable industrial and academic attention in attempts to form new flexibilities to respond to variations in renewable energy inputs to the system. However, many DSM concepts are still in the experimental demonstration phase. One of the obstacles to DSM usage is that the current information infrastructure was mainly designed for centralized systems, and does not meet DSM requirements. To overcome this barrier, this paper proposes a novel information infrastructure named the Internet of Energy Things (IoET) in order to make DSM practicable by basing it on the latest wireless communication technology: the low-power wide-area network (LPWAN). The primary advantage of LPWAN over general packet radio service (GPRS) and area Internet of Things (IoT) is its wide-area coverage, which comes with minimum power consumption and maintenance costs. Against this background, this paper briefly reviews the representative LPWAN technologies of the narrow-band Internet of Things (NB-IoT) and Long Range (LoRa) technology and compares them with GPRS and area IoT technology. LPWANs operate at a lower cost with greater power efficiency than traditional mobile networks. They are also able to support a greater number of connected devices over a larger area.
LPWANs can accommodate packet sizes from 10 to 1,000 bytes at uplink speeds of up to 200 Kbps. LPWAN range varies from 2 km to 1,000 km, depending on the technology.
Most LPWANs have a star topology in which, similar to Wi-Fi, each endpoint connects directly to common central access points.”
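The packet-size and uplink figures quoted above lend themselves to a quick back-of-the-envelope transfer-time estimate. The sketch below is illustrative only: it assumes "Kbps" means kilobits per second and ignores protocol overhead, retransmissions, and the duty-cycle limits that real LPWAN deployments impose.

```python
# Rough LPWAN uplink transfer-time estimate.
# Assumption: "Kbps" = kilobits per second; overhead, retries,
# and duty-cycle restrictions are ignored for simplicity.

def transfer_time_s(payload_bytes: int, uplink_kbps: float) -> float:
    """Seconds needed to send `payload_bytes` at `uplink_kbps`."""
    bits = payload_bytes * 8
    return bits / (uplink_kbps * 1000)

# A maximum-size 1,000-byte packet at the 200 Kbps ceiling:
print(transfer_time_s(1000, 200))   # 8,000 bits / 200,000 bps = 0.04 s

# The same packet on a much slower 10 Kbps uplink:
print(transfer_time_s(1000, 10))    # 0.8 s
```

Even at the slowest plausible rates, a single small sensor reading fits comfortably within typical reporting intervals, which is why LPWANs suit infrequent, low-volume telemetry.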
4.“For more than a century, classic circuit-switched telephony in the form of the PSTN (Public Switched Telephone Network) has dominated the world of phone communications. The alternative solution of VoIP (Voice over Internet Protocol), or Internet telephony, has nonetheless dramatically increased its share over the years. Originally started among computer enthusiasts, it has nowadays become a huge research area in both the academic community and the industry.
Therefore, many VoIP technologies have emerged to offer telephony services. However, the performance of these VoIP technologies is a key issue for the sound quality that end-users receive. When it comes to sound quality, the PSTN still stands as the benchmark. The steps and principles involved in originating VoIP telephone calls are similar to those of traditional digital telephony and involve signaling, channel setup, digitization of the analog voice signals, and encoding. Instead of being transmitted over a circuit-switched network, the digital information is packetized, and transmission occurs as IP packets over a packet-switched network.
VoIP systems transport media streams using special media delivery protocols that encode audio and video with audio and video codecs. Various codecs exist that optimize the media stream based on application requirements and network bandwidth; some implementations rely on narrowband and compressed speech, while others support high-fidelity stereo codecs.”
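The pipeline described above (digitize, encode, packetize, transmit as IP packets) can be sketched in miniature. This is a simplified illustration, not a real RTP implementation: the two-field header and the 20 ms framing interval are assumptions based on common VoIP practice, and real RTP headers carry many more fields.

```python
import struct

SAMPLE_RATE = 8000        # Hz, typical narrowband telephony
FRAME_MS = 20             # a common VoIP packetization interval
SAMPLES_PER_FRAME = SAMPLE_RATE * FRAME_MS // 1000   # 160 samples

def packetize(samples, seq_start=0):
    """Split a stream of 16-bit PCM samples into simplified
    RTP-like packets: a (sequence number, timestamp) header
    followed by the raw frame payload."""
    packets = []
    for i in range(0, len(samples), SAMPLES_PER_FRAME):
        frame = samples[i:i + SAMPLES_PER_FRAME]
        seq = seq_start + i // SAMPLES_PER_FRAME
        timestamp = i                       # measured in sample units
        header = struct.pack("!HI", seq & 0xFFFF, timestamp)
        payload = struct.pack(f"!{len(frame)}h", *frame)
        packets.append(header + payload)
    return packets

# One second of silence becomes 50 packets of 20 ms each.
packets = packetize([0] * SAMPLE_RATE)
print(len(packets))       # 50
print(len(packets[0]))    # 6-byte header + 320-byte payload = 326
```

The sequence numbers and timestamps are what let the receiver reorder late packets and reconstruct the timing of the original stream, which is the core difference from circuit-switched transmission.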
5.The article reports that educators and students are keeping up with blockchain technology's development to prepare for its future. Blockchain's ability to establish and maintain trustworthy digital identities is what makes it so different from the traditional systems of oversight that exist today. By allowing digital information to be distributed but not copied, blockchain technology has created the backbone of a new type of internet. Originally devised for the digital currency Bitcoin, the technology is now finding other potential uses in the tech community.
Bitcoin has been called “digital gold,” and for a good reason. To date, the total value of the currency is close to $9 billion US. And blockchains can make other types of digital value. Like the internet (or your car), you don’t need to know how the blockchain works to use it. However, having a basic knowledge of this new technology shows why it’s considered revolutionary.
Information held on a blockchain exists as a shared — and continually reconciled — database. This is a way of using the network that has obvious benefits. The blockchain database isn't stored in any single location, meaning the records it keeps are truly public and easily verifiable. No centralized version of this information exists for a hacker to corrupt. Hosted by millions of computers simultaneously, its data is accessible to anyone on the internet.
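The tamper-evidence described above rests on hash-linking: each block commits to the hash of its predecessor, so altering any historical record invalidates every block after it. A minimal sketch of the idea (illustrative only, not Bitcoin's actual data structures, and with no proof-of-work or networking):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain):
    """Every block must reference the true hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                     # True

chain[0]["data"] = "Alice pays Bob 500"    # tamper with history
print(is_valid(chain))                     # False: later links break
```

Because every participant can recompute these hashes, tampering is detectable by anyone holding a copy of the chain, which is what makes the "continually reconciled" shared database workable without a central authority.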
6.The Internet is the most extensive publicly accessible network in the world. Connecting to this network is done through service providers called Internet Service Providers, or ISPs. Providers typically offer multiple tiered connection schemes, which give a user the ability to take in and send out more information at the same time. This can be likened to the current practice of obtaining water from the utility company: the flow of information, or data, is like water flowing to a home. The amount of water coming into a house can be increased by purchasing a larger inlet pipe from the provider, and the same concept applies to Internet service providers. Once the water reaches the home, it can be distributed and used around the home as much as needed, up to the maximum flow that the pipes allow. The same holds for the Internet: information flows into the home and is distributed to devices until the “pipe” is full. Internet service providers now want not only to charge for the pipe coming into the home but also to split the types of information within the flow into tiered charges. This is where the idea of net neutrality comes into play. Net neutrality holds that the flow from the pipe is a stream of data, just as the flow from the water company is a stream of water, and should not be filtered in order to charge for certain types of data coming through. The Internet is a flow of information and should not be restricted based on the kinds of information coming through the pipes.
7.“McKinsey & Company has found that about 30% of tasks in 60% of occupations will be computerized in the coming years. The Bank of England’s chief economist said that 80 million US and 15 million UK jobs will be taken over by robots in the very near future.
Telemarketing has a 99% probability of automation, according to The Future of Employment report; robots, or automated IVR systems, will place the robocalls, such as insurance or bank-loan sales calls, that human telesales teams now make. Even tax preparation, which involves processing large amounts of predictable big data, has a 99% chance of being automated due to advanced information and communication technology in the field of analytics.
Robots are also going to take over back-end tasks in professions such as law, where paralegals and legal assistants face a 94% probability of automated systems or bots doing their work. Deloitte says more than 100,000 jobs in the legal sector will be automated in the next 20 years. Our world increasingly depends on elaborate networks: electric power grids, air traffic control, international finance, globally dispersed manufacturing, and so forth. Unless these networks are highly resilient, their benefits could be outweighed by catastrophic (albeit rare) breakdowns: real-world analogs of what happened to the financial system in 2008. Our cities would be paralyzed without electricity. Supermarket shelves would be empty within days if supply chains were disrupted. Air travel can spread a pandemic worldwide within days, causing the gravest havoc in the shambolic megacities of the developing world. And social media can spread panic, rumor, and economic contagion literally at the speed of light.”
8.“While machine learning can be highly theoretical, this book offers a refreshing hands-on approach without losing sight of the underlying principles. The resource covers the leading data science languages, Python and R, and the underrated but powerful Julia, as well as a range of big data platforms including Spark, Hadoop, and Mahout. Practical Machine Learning is an essential resource for modern data scientists who want to get to grips with machine learning's real-world application. The book also explores cutting-edge advances in machine learning, with worked examples and guidance on deep learning and reinforcement learning, providing you with practical demonstrations and samples that help take the theory (and mystery) out of even the most advanced machine learning methodologies. Machine learning is a method of data analysis that automates analytical model building. It is a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns, and make decisions with minimal human intervention. Because of new computing technologies, machine learning today is not like machine learning of the past. It was born from pattern recognition and the theory that computers can learn without being programmed to perform specific tasks; researchers interested in artificial intelligence wanted to see if computers could learn from data. The iterative aspect of machine learning is important because as models are exposed to new data, they are able to independently adapt. They learn from previous computations to produce reliable, repeatable decisions and results. It’s a science that’s not new – but one that has gained fresh momentum.”
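The "iterative aspect" described above, where a model adapts as it is exposed to data rather than following hand-coded rules, can be illustrated with a tiny gradient-descent fit of a one-parameter linear model. This is a from-scratch sketch for illustration, not an example from the book; the data and learning rate are made up.

```python
# Fit y ~ w * x by iterative gradient descent on mean squared error:
# the parameter improves with each pass over the data, with no
# hand-coded rule telling the model the answer.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]  # roughly y = 2x

w = 0.0                      # initial guess
lr = 0.01                    # learning rate
for epoch in range(200):
    # Gradient of mean((w*x - y)^2) with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad           # nudge the parameter toward lower error

print(round(w, 2))           # close to 2.0, learned from the data alone
```

Each epoch is one exposure to the data; repeating the update is exactly the "previous computations" feedback loop the passage describes, scaled down to a single parameter.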
9.“Due to resource constraints, most Android application developers need to address potential performance problems during application development and maintenance. The coding styles and patterns used in Android programming often affect the execution time and energy efficiency of Android applications. Thus, it is necessary for application developers to apply performance-enhancing programming practices in mobile application development. This paper introduces performance-enhancing best practices for Android programming and evaluates the impact of these practices on the CPU time of the application. The original version with the performance-worsening code is refactored into an efficient version without changing its functionality. Android programming is based on the Java programming language, so if you have a basic understanding of Java, it will be fun to learn Android application development.
Android is an open-source, Linux-based operating system for mobile devices such as smartphones and tablet computers. Android was developed by the Open Handset Alliance, led by Google, along with other companies.
Android offers a unified approach to application development for mobile devices which means developers need only develop for Android, and their applications should be able to run on different devices powered by Android.
The first beta version of the Android Software Development Kit (SDK) was released by Google in 2007, while the first commercial version, Android 1.0, was released in September 2008.”
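The paper's core idea, refactoring performance-worsening code into an efficient version without changing functionality, can be shown with a language-neutral example. The classic Android/Java case is repeated String concatenation versus StringBuilder; sketched here in Python (not Java) using its direct analogue, repeated `+=` versus a single `join`. The word lists are invented for illustration.

```python
def build_slow(words):
    # Repeated concatenation may copy the growing string on each
    # iteration: the analogue of the Java String += anti-pattern.
    s = ""
    for w in words:
        s += w + " "
    return s.rstrip()

def build_fast(words):
    # A single join computes the result in one pass:
    # the analogue of the StringBuilder-style fix.
    return " ".join(words)

words = ["view", "layout", "intent"] * 1000
# The refactoring changes cost, not behavior:
print(build_slow(words) == build_fast(words))   # True
print(build_fast(["refactor", "done"]))         # refactor done
```

The point mirrors the paper's methodology: measure an inefficient version and its refactored equivalent on identical inputs, and verify that outputs match before comparing CPU time.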
10.“The article discusses the relation of social bots to ethical norms and legal culpability in a procedure for social media known as Bot Ethics. Topics include the relation of laws to ethical standards for behavior, the use of deception by bots, and the assignment of responsibility for bot behavior to developers. A bot (short for "robot") is an automated program that runs over the Internet. Some bots run automatically, while others only execute commands when they receive specific input. There are many different types of bots, but some common examples include web crawlers, chat room bots, and malicious bots.
While most bots are used for productive purposes, some are considered malware, since they perform undesirable functions. For example, spambots capture email addresses from website contact forms, address books, and email programs, then add them to a spam mailing list. Site scrapers download entire websites, enabling unauthorized duplication of a website's contents. DoS bots send automated requests to websites, making them unresponsive. Botnets, which consist of many bots working together, may be used to gain unauthorized access to computer systems and infect computers with viruses.
Web crawlers, for instance, are used by search engines to scan websites on a regular basis. These bots "crawl" websites by following the links on each page. The crawler saves the contents of each page in the search index. By using complex algorithms, search engines can display the most relevant pages discovered by web crawlers for specific search queries.”
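The crawl-by-following-links behavior described above can be sketched with Python's standard-library HTML parser. The three pages here are an in-memory stand-in for real HTTP fetches (a hypothetical site, no network access), and the "index" is just a dictionary rather than a real search index.

```python
from html.parser import HTMLParser

# A tiny in-memory "website": path -> HTML (stands in for HTTP fetches).
SITE = {
    "/":  '<a href="/a">A</a> <a href="/b">B</a> home page',
    "/a": '<a href="/b">B</a> page about A',
    "/b": 'page about B',
}

class LinkParser(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start):
    """Visit pages by following their links, saving each page's
    contents into an index, as a search-engine crawler would."""
    index, frontier = {}, [start]
    while frontier:
        url = frontier.pop()
        if url in index or url not in SITE:
            continue                      # already seen, or dead link
        html = SITE[url]                  # a real crawler would fetch here
        index[url] = html                 # save contents in the index
        parser = LinkParser()
        parser.feed(html)
        frontier.extend(parser.links)     # follow the links on the page
    return index

index = crawl("/")
print(sorted(index))                      # ['/', '/a', '/b']
```

The visited-set check is what keeps the crawler from looping forever on pages that link to each other, and the frontier list is the queue of discovered-but-unvisited links that drives the whole traversal.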