Artificial intelligence (AI) refers to the capability of computational systems to perform tasks typically associated with human intelligence, such as learning, reasoning, problem-solving, perception, and decision-making. It is a field of research in computer science that develops and studies methods and software that enable machines to perceive their environment and use learning and intelligence to take actions that maximize their chances of achieving defined goals. Such machines may be called AIs.
High-profile applications of AI include advanced web search engines (e.g., Google Search); recommendation systems (used by YouTube, Amazon, and Netflix); virtual assistants (e.g., Google Assistant, Siri, and Alexa); autonomous vehicles (e.g., Waymo); generative and creative tools (e.g., ChatGPT and AI art); and superhuman play and analysis in strategy games (e.g., chess and Go). However, many AI applications are not perceived as AI: "A lot of cutting edge AI has filtered into general applications, often without being called AI because once something becomes useful enough and common enough it's not labeled AI anymore."
Various subfields of AI research are centered around particular goals and the use of particular tools. The traditional goals of AI research include learning, reasoning, knowledge representation, planning, natural language processing, perception, and support for robotics. General intelligence—the ability to complete any task performed by a human on an at least equal level—is among the field's long-term goals. To reach these goals, AI researchers have adapted and integrated a wide range of techniques, including search and mathematical optimization, formal logic, artificial neural networks, and methods based on statistics, operations research, and economics. AI also draws upon psychology, linguistics, philosophy, neuroscience, and other fields. (Full article...)
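Of the techniques just listed, search is the easiest to show in miniature. The sketch below runs a breadth-first search over a toy state graph in Java; the graph, state names, and goal are illustrative assumptions, not taken from the article.

    import java.util.*;

    // Minimal sketch of breadth-first search over a toy state graph.
    // The graph and goal test are invented for illustration only.
    public class BfsSketch {
        public static void main(String[] args) {
            Map<String, List<String>> graph = Map.of(
                "start", List.of("a", "b"),
                "a", List.of("goal"),
                "b", List.of("a"),
                "goal", List.of()
            );
            System.out.println(bfs(graph, "start", "goal")); // [start, a, goal]
        }

        static List<String> bfs(Map<String, List<String>> graph, String start, String goal) {
            Map<String, String> parent = new HashMap<>(); // child -> predecessor
            Deque<String> frontier = new ArrayDeque<>(List.of(start));
            parent.put(start, null);
            while (!frontier.isEmpty()) {
                String state = frontier.removeFirst();
                if (state.equals(goal)) { // reconstruct the path back to start
                    List<String> path = new ArrayList<>();
                    for (String s = goal; s != null; s = parent.get(s)) path.add(0, s);
                    return path;
                }
                for (String next : graph.getOrDefault(state, List.of())) {
                    if (!parent.containsKey(next)) { // not yet visited
                        parent.put(next, state);
                        frontier.addLast(next);
                    }
                }
            }
            return List.of(); // no path from start to goal
        }
    }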
Steve Jobs (February 24, 1955 – October 5, 2011) was an American businessman, inventor, and investor best known for co-founding the technology company Apple Inc. Jobs was also the founder of NeXT and chairman and majority shareholder of Pixar. He was a pioneer of the personal computer revolution of the 1970s and 1980s, along with his early business partner and fellow Apple co-founder Steve Wozniak.
Jobs was born in San Francisco in 1955 and adopted shortly afterwards. He attended Reed College in 1972 before withdrawing that same year. In 1974, he traveled through India, seeking enlightenment before later studying Zen Buddhism. He and Wozniak co-founded Apple in 1976 to further develop and sell Wozniak's Apple I personal computer. Together, the duo gained fame and wealth a year later with production and sale of the Apple II, one of the first highly successful mass-produced microcomputers.
Jobs saw the commercial potential of the Xerox Alto in 1979, which was mouse-driven and had a graphical user interface (GUI). This led to the development of the largely unsuccessful Apple Lisa in 1983, followed by the breakthrough Macintosh in 1984, the first mass-produced computer with a GUI. The Macintosh launched the desktop publishing industry in 1985 (for example, the Aldus PageMaker) with the addition of the Apple LaserWriter, the first laser printer to feature vector graphics and PostScript. (Full article...)
A linker or link editor is a computer program that combines intermediate software build files such as object and library files into a single executable file such as a program or library. A linker is often part of a toolchain that includes a compiler and/or assembler that generates intermediate files that the linker processes. The linker may be integrated with other toolchain tools such that the user does not interact with the linker directly.
A simpler version that writes its output directly to memory is called the loader, though loading is typically considered a separate process. (Full article...)
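Java programs are not linked ahead of time the way the excerpt describes for C-style toolchains; instead, the JVM's class loader resolves references when classes are brought into memory, which is closest in spirit to the loader mentioned above. As a loose analogy only (the plugins/ directory and Plugin class name are invented), this Java sketch loads a compiled class into memory by name at runtime.

    import java.net.URL;
    import java.net.URLClassLoader;

    // Loose analogy to a loader: bring compiled code into memory at runtime.
    // Assumes ./plugins contains a compiled Plugin.class with a no-arg
    // constructor (both invented for illustration).
    public class LoaderSketch {
        public static void main(String[] args) throws Exception {
            URLClassLoader loader = new URLClassLoader(new URL[] { new URL("file:plugins/") });
            Class<?> plugin = loader.loadClass("Plugin"); // resolved now, not at build time
            Object instance = plugin.getDeclaredConstructor().newInstance();
            System.out.println("loaded and instantiated " + instance.getClass().getName());
            loader.close();
        }
    }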
Programming languages are used for controlling the behavior of a machine (often a computer). Like natural languages, programming languages follow rules for syntax and semantics.
There are thousands of programming languages and new ones are created every year. Few languages ever become sufficiently popular that they are used by more than a few people, but professional programmers may use dozens of languages in a career.
Most programming languages are not standardized by an international (or national) standard, even widely used ones, such as Perl or Standard ML (despite the name). Notable standardized programming languages include ALGOL, C, C++, JavaScript (under the name ECMAScript), Smalltalk, Prolog, Common Lisp, Scheme (IEEE standard), ISLISP, Ada, Fortran, COBOL, SQL, and XQuery. (Full article...)
Java is a high-level, general-purpose, memory-safe, object-oriented programming language. It is intended to let programmers write once, run anywhere (WORA), meaning that compiled Java code can run on all platforms that support Java without the need to recompile. Java applications are typically compiled to bytecode that can run on any Java virtual machine (JVM) regardless of the underlying computer architecture. The syntax of Java is similar to C and C++, but it has fewer low-level facilities than either of them. The Java runtime provides dynamic capabilities (such as reflection and runtime code modification) that are typically not available in traditional compiled languages.
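To make the reflection point concrete, here is a minimal sketch that looks a method up by name at runtime and invokes it; the Greeter class and its method are invented for illustration, not part of any standard library.

    import java.lang.reflect.Method;

    // Minimal sketch: invoking a method chosen by name at runtime.
    // Greeter is an invented example class.
    public class ReflectionSketch {
        public static class Greeter {
            public String greet(String name) { return "Hello, " + name; }
        }

        public static void main(String[] args) throws Exception {
            Object target = new Greeter();
            // Look the method up by name and parameter types at runtime...
            Method m = target.getClass().getMethod("greet", String.class);
            // ...then invoke it without the call site being fixed at compile time.
            System.out.println(m.invoke(target, "world")); // prints: Hello, world
        }
    }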
Java gained popularity shortly after its release, and has been a popular programming language since then. Java was the third most popular programming language in 2022 according to GitHub. Although still widely popular, Java has seen a gradual decline in use in recent years as other languages that use the JVM have gained popularity.
Java was designed by James Gosling at Sun Microsystems. It was released in May 1995 as a core component of Sun's Java platform. The original and reference implementation Java compilers, virtual machines, and class libraries were released by Sun under proprietary licenses. As of May 2007, in compliance with the specifications of the Java Community Process, Sun had relicensed most of its Java technologies under the GPL-2.0-only license. Oracle, which bought Sun in 2010, offers its own HotSpot Java Virtual Machine. However, the official reference implementation is the OpenJDK JVM, which is open-source software used by most developers and is the default JVM for almost all Linux distributions. (Full article...)
The history of artificial intelligence (AI) began in antiquity, with myths, stories, and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen. The study of logic and formal reasoning from antiquity to the present led directly to the invention of the programmable digital computer in the 1940s, a machine based on abstract mathematical reasoning. This device and the ideas behind it inspired scientists to begin discussing the possibility of building an electronic brain.
The field of AI research was founded at a workshop held on the campus of Dartmouth College in 1956. Attendees of the workshop became the leaders of AI research for decades. Many of them predicted that machines as intelligent as humans would exist within a generation. The U.S. government provided millions of dollars with the hope of making this vision come true.
Eventually, it became obvious that researchers had grossly underestimated the difficulty of this feat. In 1974, criticism from James Lighthill and pressure from the U.S. Congress led the U.S. and British governments to stop funding undirected research into artificial intelligence. Seven years later, a visionary initiative by the Japanese government and the success of expert systems reinvigorated investment in AI, and by the late 1980s, the industry had grown into a billion-dollar enterprise. However, investors' enthusiasm waned in the 1990s, and the field was criticized in the press and avoided by industry (a period known as an "AI winter"). Nevertheless, research and funding continued to grow under other names. (Full article...)
Perl is a high-level, general-purpose, interpreted, dynamic programming language. Though Perl is not officially an acronym, there are various backronyms in use, including "Practical Extraction and Reporting Language".
Perl was developed by Larry Wall in 1987 as a general-purpose Unix scripting language to make report processing easier. Since then, it has undergone many changes and revisions. The name was originally written in lowercase ("perl") and had become capitalized by the time Perl 4 was released. The latest major version is Perl 5, first released in 1994. From 2000 to October 2019 a sixth version of Perl was in development; it was eventually renamed Raku. Both languages continue to be developed independently by different development teams, which liberally borrow ideas from each other.
Perl borrows features from other programming languages including C, sh, AWK, and sed. It provides text processing facilities without the arbitrary data-length limits of many contemporary Unix command line tools. Perl is a highly expressive programming language: source code for a given algorithm can be short and highly compressible. (Full article...)
COBOL (an acronym for "common business-oriented language") is a compiled English-like computer programming language designed for business use. It is an imperative, procedural, and, since 2002, object-oriented language. COBOL is primarily used in business, finance, and administrative systems for companies and governments. COBOL is still widely used in applications deployed on mainframe computers, such as large-scale batch and transaction processing jobs. Many large financial institutions were developing new systems in the language as late as 2006, but most programming in COBOL today is purely to maintain existing applications. Programs are being moved to new platforms, rewritten in modern languages, or replaced with other software.
COBOL was designed in 1959 by CODASYL and was partly based on the programming language FLOW-MATIC, designed by Grace Hopper. It was created as part of a U.S. Department of Defense effort to create a portable programming language for data processing. It was originally seen as a stopgap, but the Defense Department promptly pressured computer manufacturers to provide it, resulting in its widespread adoption. It was standardized in 1968 and has been revised five times. Expansions include support for structured and object-oriented programming. The current standard is ISO/IEC 1989:2023.
COBOL statements have prose syntax such as MOVE x TO y, which was designed to be self-documenting and highly readable. However, it is verbose and uses over 300 reserved words, compared to the succinct and mathematically inspired syntax of other languages. (Full article...)
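For contrast with the prose statement quoted above, the equivalent assignment in a terser language such as Java (used for this page's examples) is a single operator:

    public class MoveSketch {
        public static void main(String[] args) {
            int x = 42;
            int y = x; // Java's "y = x" plays the role of COBOL's "MOVE x TO y"
            System.out.println(y); // prints: 42
        }
    }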
Steve Wozniak (born August 11, 1950), also known by his nickname "Woz", is an American technology entrepreneur, electrical engineer, computer programmer, philanthropist, and inventor. In 1976, he co-founded Apple Computer with his early business partner Steve Jobs. Through his work at Apple in the 1970s and 1980s, he is widely recognized as one of the most prominent pioneers of the personal computer revolution.
In 1975, Wozniak started developing the Apple I into the computer that launched Apple when he and Jobs first began marketing it the following year. He was the primary designer of the Apple II, introduced in 1977, known as one of the first highly successful mass-produced microcomputers, while Jobs oversaw the development of its foam-molded plastic case and early Apple employee Rod Holt developed its switching power supply.
With human–computer interface expert Jef Raskin, Wozniak had a major influence over the initial development of the original Macintosh concepts from 1979 to 1981, when Jobs took over the project following Wozniak's brief departure from the company due to a traumatic airplane accident. After permanently leaving Apple in 1985, Wozniak founded CL 9 and created the first programmable universal remote, released in 1987. He then pursued several other businesses and philanthropic ventures throughout his career, focusing largely on technology in K–12 schools. (Full article...)
PHP is a general-purpose scripting language geared towards web development. It was originally created by Danish-Canadian programmer Rasmus Lerdorf in 1993 and released in 1995. The PHP reference implementation is now produced by the PHP Group. PHP was originally an abbreviation of Personal Home Page, but it now stands for the recursive acronym PHP: Hypertext Preprocessor.
PHP code is usually processed on a web server by a PHP interpreter implemented as a module, a daemon or a Common Gateway Interface (CGI) executable. On a web server, the result of the interpreted and executed PHP code—which may be any type of data, such as generated HTML or binary image data—would form the whole or part of an HTTP response. Various web template systems, web content management systems, and web frameworks exist that can be employed to orchestrate or facilitate the generation of that response. Additionally, PHP can be used for many programming tasks outside the web context, such as standalone graphical applications and drone control. PHP code can also be directly executed from the command line.
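PHP itself is beyond a short sketch here, but the request-to-response model described above can be illustrated in Java (the language used for this page's examples) with the JDK's built-in com.sun.net.httpserver package: server-side code runs per request, and its output, here generated HTML, forms the HTTP response. The port number and page content are illustrative assumptions.

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    // Sketch of server-side HTML generation, analogous to how a PHP
    // interpreter's output forms all or part of an HTTP response.
    public class HtmlResponseSketch {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/", exchange -> {
                // Code runs per request; its output becomes the response body.
                String html = "<html><body><p>Generated at "
                        + java.time.Instant.now() + "</p></body></html>";
                byte[] body = html.getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().set("Content-Type", "text/html; charset=utf-8");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });
            server.start(); // serves http://localhost:8080/ until stopped
        }
    }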
The standard PHP interpreter, powered by the Zend Engine, is free software released under the PHP License. PHP has been widely ported and can be deployed on most web servers on a variety of operating systems and platforms. (Full article...)
Parallel computing is a type of computation in which many calculations or processes are carried out simultaneously. Large problems can often be divided into smaller ones, which can then be solved at the same time. There are several different forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has long been employed in high-performance computing, but has gained broader interest due to the physical constraints preventing frequency scaling. As power consumption (and consequently heat generation) by computers has become a concern in recent years, parallel computing has become the dominant paradigm in computer architecture, mainly in the form of multi-core processors.
In computer science, parallelism and concurrency are two different things: a parallel program uses multiple CPU cores, each core performing a task independently. Concurrency, on the other hand, enables a program to deal with multiple tasks even on a single CPU core; the core switches between tasks (i.e., threads) without necessarily completing each one. A program can exhibit parallelism, concurrency, both, or neither.
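A minimal sketch of the distinction in Java (the array contents and chunking scheme are illustrative assumptions): the program below achieves parallelism by splitting a sum into independent chunks run on one thread per core, whereas concurrency would only require the tasks to be interleaved, even on a single core.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Parallelism sketch: independent chunks of work run on separate cores.
    public class ParallelSum {
        public static void main(String[] args) throws Exception {
            long[] data = new long[8_000_000];
            for (int i = 0; i < data.length; i++) data[i] = i;

            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);
            int chunk = data.length / cores;
            List<Future<Long>> parts = new ArrayList<>();
            for (int c = 0; c < cores; c++) {
                final int lo = c * chunk;
                final int hi = (c == cores - 1) ? data.length : lo + chunk;
                // Each chunk is independent, so the cores need not coordinate
                // until the partial results are combined below.
                parts.add(pool.submit(() -> {
                    long sum = 0;
                    for (int i = lo; i < hi; i++) sum += data[i];
                    return sum;
                }));
            }
            long total = 0;
            for (Future<Long> part : parts) total += part.get();
            pool.shutdown();
            System.out.println("total = " + total);
            // A concurrent-but-not-parallel variant would submit the same tasks
            // to a single-threaded executor, which interleaves them on one core.
        }
    }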
Parallel computers can be roughly classified according to the level at which the hardware supports parallelism, with multi-core and multi-processor computers having multiple processing elements within a single machine, while clusters, MPPs, and grids use multiple computers to work on the same task. Specialized parallel computer architectures are sometimes used alongside traditional processors, for accelerating specific tasks. (Full article...)
An animation of the quicksort algorithm sorting an array of randomized values
Grace Hopper at the UNIVAC keyboard, c. 1960. Grace Brewster Murray Hopper: American mathematician and rear admiral in the U.S. Navy who was a pioneer in developing computer technology, helping to devise UNIVAC I, the first commercial electronic computer, and naval applications for COBOL (common business-oriented language).
GNOME Shell, GNOME Clocks, Evince, gThumb and GNOME Files at version 3.30, in a dark theme
Animation of a Non-uniform rational B-spline surface. Modeled and rendered in Cobalt.
A screenshot of GNU Emacs 22.0.91.1, from Ubuntu's emacs-snapshot-gtk package.
Margaret Hamilton standing next to the navigation software that she and her MIT team produced for the Apollo Project.
Partial map of the Internet based on the January 15, 2005 data found on opte.org. Each line is drawn between two nodes, representing two IP addresses. The length of each line is indicative of the delay between those two nodes. This graph represents less than 30% of the Class C networks reachable by the data collection program in early 2005.
Output from a (linearised) shallow water equation model of water in a bathtub. The water experiences 5 splashes, which generate surface gravity waves that propagate away from the splash locations and reflect off the bathtub walls.
An IBM Port-A-Punch punched card
Stephen Wolfram is a British-American computer scientist, physicist, and businessman. He is known for his work in computer science, mathematics, and theoretical physics.
A head crash on a modern hard disk drive
Deep Blue was a chess-playing expert system run on a unique purpose-built IBM supercomputer. It was the first computer to win a game, and the first to win a match, against a reigning world champion under regular time controls. Photo taken at the Computer History Museum.
A lone house. An image made using Blender 3D.
This image (when viewed in full size, 1000 pixels wide) contains 1 million pixels, each of a different color.
Gosper's Glider Gun in action
Ada Lovelace was an English mathematician and writer, chiefly known for her work on Charles Babbage's proposed mechanical general-purpose computer, the Analytical Engine. She was the first to recognize that the machine had applications beyond pure calculation, and to have published the first algorithm intended to be carried out by such a machine. As a result, she is often regarded as the first computer programmer.
A view of the GNU nano text editor, version 6.0
Partial view of the Mandelbrot set. Step 1 of a zoom sequence: the gap between the "head" and the "body", also called the "seahorse valley".