Philip M. Parker, a marketing professor at INSEAD (the European Institute of Business Administration), has developed and patented a system that uses an algorithm to automatically compile data into book form. Between his own works and those of his research group (ICON Group International), he has more than 900,000 books currently for sale on Amazon. More than a smart search engine, his system requires only a few minutes to a few hours to scan the databases relevant to a given topic and organize that data into a technical report. Next stop? Romance novels.
There are few things in life quite as boring as writing a technical report. You accumulate all available data on the topic, then categorize and prioritize the information. A general structure within which to present the data is then chosen from a few common structures, whereupon the collected and sorted information is presented as a report. Such reports are formulaic – produced in accordance with a slavishly followed rule or style – and their generation is largely a process of intellectual drudgery requiring very little creativity. It's exactly the sort of task for which computers were developed.
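The steps above – gather the facts, categorize them, rank them, and pour the result into a stock structure – can be sketched as a small program. This is only an illustrative sketch of formulaic report generation; the function names, data layout, and template are assumptions for the example, not Parker's patented system:

```python
# Minimal sketch of formulaic report generation: group facts by
# category, rank them within each group, then slot the result into
# a fixed template. All names here are illustrative assumptions.
from collections import defaultdict

def generate_report(topic, facts, template):
    """facts: list of (category, priority, text) tuples; higher
    priority means the fact appears earlier in its section."""
    # Categorize: group facts under their section heading.
    sections = defaultdict(list)
    for category, priority, text in facts:
        sections[category].append((priority, text))

    # Prioritize: highest-priority facts first within each section.
    body = []
    for category in sorted(sections):
        ranked = [t for _, t in sorted(sections[category], reverse=True)]
        body.append(f"## {category}\n" + "\n".join(f"- {t}" for t in ranked))

    # Present: pour everything into one of a few stock structures.
    return template.format(topic=topic, body="\n\n".join(body))

TEMPLATE = "# Report: {topic}\n\n{body}\n"

report = generate_report(
    "Widgets",
    [("Market", 1, "Demand is seasonal"),
     ("Market", 2, "Sales grew 5% last year"),
     ("Production", 1, "Three major suppliers exist")],
    TEMPLATE,
)
print(report)
```

Given richer inputs – database rows instead of hand-typed tuples, and a library of templates instead of one – the same skeleton scales to book length, which is essentially the point of the patent.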
Prof. Parker, an author of several conventionally written technical and business reports, realized one day that the process of writing such a report can be described as a reasonably well-defined algorithm. He then set out to program a computer to carry out this algorithm, work for which he was issued US Patent 7,266,767 ("Method and apparatus for automated authoring and marketing").
Parker designed the algorithm to follow (hopefully closely) the path that an expert would take in writing a summary about a data-rich subject. There are similarities to IBM's Jeopardy grandmaster Watson, which also casts a wide data net, then organizes and summarizes the data so it can respond rapidly to questions.
Some examples of books written by Parker's program include:
- Satirists: Webster's Quotations, Facts and Phrases
While some of these titles seem difficult to believe, there is a (generally small) market for each of them. If a person or company needs in-depth data about a subject, however narrow, that data has value in organized form. Parker's program works especially well with modern distribution technologies, turning print-on-demand into written-on-demand.
Parker has also used an outgrowth of his algorithm to write a comprehensive set of poems about roughly 80,000 words in the English language. Totopoetry is a collection of algorithmically authored poetry that neatly illustrates the strengths and limitations of algorithmic writing. Poems are written in 17 styles (e.g., Haiku, limerick, sonnet) for each word in English.
Each poem is intended to illuminate the meaning of the word on which it is based. For example, the octosyllable poem for "poetry" reads:
"Really instant and overt.
 But; also distant and covert." – Totopoetry
And then there is the modern Haiku form, again on "poetry":
"An executive,
 evokes; many directions,
 numbers; and manners" – Totopoetry
At present Parker's algorithm cannot judge its work against some intrinsic or personal measure of poetic merit.
The next area of formulaic writing to which Parker wants to adapt his algorithm is romance novels, which are widely (perhaps unfairly) denigrated as "cookie-cutter" literature. Parker believes their simplicity and limited plot structures make romances the best target for an early attack on fiction writing. Regardless of his level of success, human authors are likely to face progressively more competition from algorithmic authors over the next decade or so. For now, the place of the best human writers seems safe – but for how long? Time will tell.
Source: ExtremeTech.com
The article above is rather dated in terms of covering recent developments. The following may be of interest, wrt your comment on humanity:
http://gulfnews.com/news/gulf/uae/education/campus-in-abu-dhabi-helps-make-farming-easier-with-new-radio-technology-1.1041606
People have been doing this on the internet with auto-generated blogs stuffed with embedded ads since 2000. Basically, they give it a keyword; the software searches the internet, finds lots of relevant content, and uses some algorithm to generate a new blog site with, say, 100 randomly generated pages on the topic.
These blogs offer zero value, however. You can usually spot them after reading a single paragraph. They just pollute the internet. Thankfully, Google and others have gotten pretty good at filtering them out.
Seems like the same thing is going to happen to ebook stores.