20131012

making money with PTC sites

good afternoon to all you cool dudes and dudettes out there....
I want to share some PTC sites that are genuinely legit and not scams ..
First off, what is PTC? It stands for Paid To Click, and the point is exactly that: at a PTC site you just click ads and get paid.. that's all there is to it, seriously.
But like any venture it really does take hard work. You get paid $0.001 per click; figure you get a quota of about 30 ads a day, then multiply that over a month.. hehe, not bad for just clicking. There are plenty of other ways to boost your earnings too, but it all has to go step by step; anyone who promises instant big money is lying ....
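Just to sanity-check that arithmetic, here's a quick Python sketch (the per-click rate and the 30-ads-a-day quota are this post's own figures, not a promise from any particular site):

    # Rough monthly PTC earnings using the figures quoted above.
    RATE_PER_CLICK = 0.001   # dollars per click, as stated in the post
    ADS_PER_DAY = 30         # the post's assumed daily ad quota
    DAYS_PER_MONTH = 30

    monthly = RATE_PER_CLICK * ADS_PER_DAY * DAYS_PER_MONTH
    print(f"${monthly:.2f} per month")  # prints: $0.90 per month

So base clicking alone is under a dollar a month; the extra strategies the post alludes to are where any real earnings would have to come from.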

Alright then, here's the list. Register right away before too many people find out; you can click the banners below, okay!

01
http://www.neobux.com/?r=surayvoices

The first one is Neobux; just register by clicking the banner above.


02
The second one is called Probux; go ahead and sign up for that one too.

03

The third one is called ClixSense.. a solid one too... so hurry up and click its banner


Sorry if my writing is all over the place, but the content is for real, hahah.. I'm new to blogging, hehe

Leave a comment, yeah!



20130225

Computer




A computer is a programmable machine that receives input, stores and manipulates data/information, and provides output in a useful format.
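Since that definition is abstract, here is a minimal Python sketch of the input/process/output cycle it describes; the summing step is just an arbitrary stand-in for "manipulating data":

    # Input -> process -> output, in miniature.
    def compute(raw: str) -> str:
        numbers = [float(tok) for tok in raw.split()]  # input: receive and parse the data
        total = sum(numbers)                           # process: manipulate the data
        return f"sum = {total}"                        # output: a useful format

    print(compute("1 2 3.5"))  # prints: sum = 6.5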

While a computer can, in theory, be made out of almost anything, and mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space.

Simple computers are small enough to fit into mobile devices, and can be powered by a small battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from MP3 players to fighter aircraft and from toys to industrial robots are the most numerous.

Using the Internet to Find Your Niche

The Internet is a wonderful place where users can find a great deal of information. However, many are not aware that the Internet is also where some savvy entrepreneurs can find lucrative business opportunities. Internet niche marketing is just one example of how those in the know can turn their hard work and dedication into profit. This is not to say that Internet marketing is a simple field where anyone can prosper, but there are opportunities for those who are willing to persevere in their efforts.

Learning Internet Marketing Online

Believe it or not, Internet niche marketing is a subject that can be learned online. It certainly helps for those who hope to prosper in this industry to have some knowledge of marketing and business before venturing into an Internet niche marketing campaign, but it is not necessary. There is a great deal of information on organizing and executing a niche marketing campaign available online. This information may come in a number of different forms, including websites offering informative articles, message boards focusing on the industry, and ebooks which are available free of charge or for a fee.

Let’s first examine learning about Internet marketing through websites. Type the search term, “Internet niche marketing”, into your favorite search engine and you will likely receive millions of search results. Sifting through all of the search results would be rather time consuming and many of them would likely not be relevant. Fortunately, the search engines do a great deal of work for you and the most useful websites will likely appear on the first couple of pages of search results. This still leaves you with a great deal of information to sort through, but considering you are likely planning to turn niche marketing into a career, this research is certainly worthwhile.

Carefully examine the search results you obtain from your search and bookmark the websites which seem most useful. Next take as much time as necessary to comb through all of these websites to find the most useful information. Take notes as you do to create a comprehensive resource for yourself. After this, review your notes and investigate further any items which seem unclear to you. This research may include offline resources such as books or phone calls to experts in the industry.

Search Engines are Your Friend

Now that you have already used the Internet to learn about the industry of Internet marketing, you probably know that finding a profitable niche is imperative. A niche is essentially a specific area of interest. Ideally you will already be an expert in this subject and it will be a subject which has a wide Internet audience without a great deal of existing websites focusing on this niche. Once again, you can turn to the Internet for finding this niche.

You may already have a few ideas for niches. These are probably subjects you are passionate about and understand very well. Examining statistical information provided by popular search engines regarding the popularity of search terms related to your niche will give you a good indication of whether or not Internet users are interested in your niche. If keywords related to your niche are searched on these search engines often, it is evident there is an audience for your niche. Next it is time to enter these keywords in a search engine and evaluate the websites which are provided as results for these keywords. If there are many strong results the niche can be considered saturated. In this case, it is a good idea to abandon the original idea and search for another niche. However, if there are not many high quality search results, you may have found your perfect niche.
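As a concrete illustration of that keyword check, here is a short Python sketch using pytrends, an unofficial Google Trends client; the library choice and the sample keywords are my assumptions, not something the article names:

    # Compare 12-month search interest for a few candidate niche keywords.
    # Requires: pip install pytrends (unofficial Google Trends API client).
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=360)
    keywords = ["koi pond care", "bonsai pruning"]  # hypothetical niches
    pytrends.build_payload(keywords, timeframe="today 12-m")

    interest = pytrends.interest_over_time()  # a pandas DataFrame of weekly scores
    print(interest[keywords].mean())          # average interest per keyword

Consistently high interest suggests an audience exists; the saturation check still has to be done by evaluating the competing sites, as described above.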

Article Source: http://www.articlecity.com/articles/computers_and_internet/article_5652.shtml

Asus launches world's thinnest notebook

Asus has finally launched a netbook billed as the thinnest and lightest in the world. At 17.6 millimeters thick and weighing in at 920 grams, the X101 series challenges the thin laptops already on the market without sacrificing functionality.

The product was actually shown off a few months ago at Computex 2011, the international computer exhibition in Taipei, Taiwan. The netbook's light, slim design is meant to slip easily into a briefcase and be carried anywhere.

The netbook is reportedly designed around cloud computing to emphasize its ability to surf the Internet. The evidence is its 8 GB solid-state drive and the absence of any hard disk.

The operating system is MeeGo, which is essentially built to work through cloud computing, with access to exclusive services such as Asus Vibe and the Asus App Store, as well as online file storage through Dropbox.

The X101's specifications hold up, too, even though it is aimed at netbook users and people who want a secondary computer. It is powered by an Intel Atom processor with 1 GB of RAM, and offers two USB 2.0 ports, a micro SD slot, and a webcam. Innovations such as Asus's vaunted Super Hybrid Engine keep the netbook running efficiently and save battery power.

20130222

The Blits From Turus 

Members:
  1. Oges 
  2. Dhezee
  3. Iaz
  4. Ukit
  5. Enay
  6. Enoz
  7. Hafidz
  8. Decky
  9. Iben
  10. Do noel
  11. Ahmed
  12. Rakha
  13. Grey
  14. Cibex
  15. Sepri
  16. Rifhan
  17. Murfy
  18. Fikri
  19. Yadi
  20. Qowi
  21. Fatah
  22. Ari
  23. Rony
  24. Fiqih


20130221

Portable Computers

As it turned out, the idea of a laptop-like portable computer existed even before it was possible to create one. It was developed at Xerox PARC by Alan Kay, who called it the Dynabook and intended it for children. The first portable computer actually created was the Xerox Notetaker, but only 10 were produced. The first commercialized laptop was the Osborne 1 in 1981, with a small 5″ CRT monitor and a keyboard that sits inside the lid when closed. It ran CP/M (the OS that Microsoft bought and based DOS on). Later portable computers included the Bondwell 2, released in 1985, also running CP/M, which was among the first with a hinge-mounted LCD display. The Compaq Portable was the first IBM PC compatible portable computer, and it ran MS-DOS, but was less portable than the Bondwell 2. Other examples of early portable computers included the Epson HX-20, GRiD Compass, Dulmont Magnum, Kyotronic 85, Commodore SX-64, IBM PC Convertible, and Toshiba T1100, T1000, and T1200. The first portable computers which resemble modern laptops in features were Apple’s PowerBooks, which first introduced a built-in trackball, and later a trackpad and optional color LCD screens. IBM’s ThinkPad was largely inspired by the PowerBook’s design, and the evolution of the two led to laptops and notebook computers as we know them. PowerBooks were eventually replaced by modern MacBook Pros. Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology and so on. This evolution ultimately allowed computers even smaller and more portable than laptops, such as PDAs, tablets, and smartphones.

Graphical User Interface (GUI)

Possibly the most significant of those shifts was the invention of the graphical user interface, and the mouse as a way of controlling it. Doug Engelbart and his team at the Stanford Research Institute developed the first mouse and a graphical user interface, demonstrated in 1968. They were just a few years short of the beginning of the personal computer revolution sparked by the Altair 8800, so their idea didn’t take hold. Instead it was picked up and improved upon by researchers at the Xerox PARC research center, which in 1973 developed the Xerox Alto, the first computer with a mouse-driven GUI. It never became a commercial product, however, as Xerox management wasn’t ready to dive into the computer market and didn’t see the potential of what they had early enough. It took Steve Jobs negotiating a stock deal with Xerox in exchange for a tour of their research center to finally bring the user-friendly graphical user interface, as well as the mouse, to the masses. Steve Jobs was shown what the Xerox PARC team had developed, and directed Apple to improve upon it. In 1984 Apple introduced the Macintosh, the first mass-market computer with a graphical user interface and a mouse. Microsoft later caught on and produced Windows, and the historic competition between the two companies started, resulting in improvements to the graphical user interface to this day. Meanwhile IBM was dominating the PC market with their IBM PC, and Microsoft was riding on their coattails by being the one to produce and sell the operating system for the IBM PC, known as “DOS” or “Disk Operating System”. The Macintosh, with its graphical user interface, was meant to dislodge IBM’s dominance, but Microsoft made this more difficult with their PC-compatible Windows operating system with its own GUI.

Second Generation Microcomputers (1977 – present)

As microcomputers continued to evolve they became easier to operate, making them accessible to a larger audience. They typically came with a keyboard and a monitor, or could be easily connected to a TV, and they supported visual representation of text and numbers on the screen. In other words, lights and switches were replaced by screens and keyboards, and the necessity to understand binary code was diminished as they increasingly came with programs that could be used by issuing more easily understandable commands. Famous early examples of such computers include the Commodore PET, the Apple II, and in the 80s the IBM PC. The nature of the underlying electronic components didn’t change between these computers and the modern computers we know today, but what did change was the number of circuits that could be put onto a single microchip. Intel’s co-founder Gordon Moore predicted the doubling of the number of transistors on a single chip every two years, which became known as “Moore’s Law”, and this trend has roughly held for over 30 years thanks to advancing manufacturing processes and microprocessor designs. The consequence was a predictable exponential increase in processing power that could be put into a smaller package, which had a direct effect on the possible form factors as well as applications of modern computers, which is what most of the forthcoming paradigm-shifting innovations in computing were about.
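To make the scale of that doubling concrete, here is a small Python sketch; the Intel 4004's roughly 2,300 transistors is a widely cited baseline figure, used here as an assumption rather than something this article states:

    # What "doubling every two years" compounds to over three decades.
    BASELINE_TRANSISTORS = 2300   # Intel 4004, 1971 (assumed figure)
    YEARS_PER_DOUBLING = 2

    def transistors_after(years):
        doublings = years // YEARS_PER_DOUBLING
        return BASELINE_TRANSISTORS * 2 ** doublings

    for years in (10, 20, 30):
        print(f"{years} years: {transistors_after(years):,} transistors")
    # 30 years = 15 doublings, i.e. a factor of 2**15 = 32,768

That factor of tens of thousands in as many years is what turned room-sized machines into pocket-sized ones.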

First Generation of Microcomputers (1971 – 1976)

The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only by engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers. It is arguable which of the early microcomputers could be called the first. The CTC Datapoint 2200 is one candidate, although it actually didn’t contain a microprocessor (being based on a multi-chip CPU design instead), and wasn’t meant to be a standalone computer, but merely a terminal for mainframes. The reason some might consider it the first microcomputer is that it could be used as a de facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became a basis for the x86 architecture later used in the IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days. However, if we are looking for the first microcomputer that came with a proper microprocessor, was meant to be a standalone computer, and didn’t come as a kit, then it would be the Micral N, which used the Intel 8008 microprocessor. Popular early microcomputers which did come in kits include the MOS Technology KIM-1, the Altair 8800, and the Apple I. The Altair 8800 in particular spawned a large following among hobbyists, and is considered the spark that started the microcomputer revolution, as these hobbyists went on to found companies centered around personal computing, such as Microsoft and Apple.

Fourth Generation Computers (1971 – present)

The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004. The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

Third Generation Computers (1960s)

The invention of integrated circuits (ICs), also known as microchips, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on. Minicomputers appeared first; the earliest were still based on discrete (non-microchip) transistors, while later versions were hybrids of transistors and microchips, such as IBM’s System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

Second Generation Computers (1955 – 1960)

The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller compared to the first generation, albeit still big by today’s standards. The first transistor computer was created at the University of Manchester in 1953. The most popular of the transistor computers was the IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC.

First Generation Computers (1940s – 1950s)

The first electronic computers used vacuum tubes, and they were huge and complex. The first general purpose electronic computer was ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it didn’t operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors. The first non-general-purpose electronic computer was the ABC (Atanasoff–Berry Computer), and other similar computers of this era included the German Z3, the ten British Colossus computers, LEO, the Harvard Mark I, and UNIVAC.

What is a computer?

In its most basic form a computer is any device which aids humans in performing various kinds of computations or calculations. In that respect the earliest computer was the abacus, used to perform basic arithmetic operations. Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output and processing are simply the act of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell. We input information, the computer processes it according to its basic logic or the program currently running, and outputs the results.

Modern computers do this electronically, which enables them to perform a vastly greater number of calculations or computations in less time. Despite the fact that we currently use computers to process images, sound, text and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. Graphics, sound etc. are merely abstractions of the numbers being crunched within the machine; in digital computers these are the ones and zeros, representing electrical on and off states, and endless combinations of those. In other words, every image, every sound, and every word has a corresponding binary code. While the abacus may technically have been the first computer, most people today associate the word “computer” with the electronic computers invented in the last century, which have evolved into the modern computers we know today.
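To make that last point concrete, here is a tiny Python sketch showing the binary codes a modern computer uses for a piece of text; the 8-bit ASCII/UTF-8 codes shown are one common encoding among several:

    # Every character maps to a binary code; text is numbers underneath.
    word = "computer"
    for ch in word:
        print(ch, format(ord(ch), "08b"))
    # e.g. 'c' -> 01100011, 'o' -> 01101111, ...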

James Watt

James Watt, the eldest surviving child of eight children, five of whom died in infancy, of James Watt (1698–1782) and his wife, Agnes Muirhead (1703–1755), was born in Greenock on 19th January, 1736. His father was a successful merchant. According to his biographer, Jennifer Tann: "James Watt was a delicate child and suffered from frequent headaches during his childhood and adult life. He was taught at home by his mother at first, then was sent to M'Adam's school in Greenock. He later went to Greenock grammar school where he learned Latin and some Greek but was considered to be slow. However, on being introduced to mathematics, he showed both interest and ability."

At the age of nineteen he was sent to Glasgow to learn the trade of a mathematical-instrument maker. After spending a year in London, Watt returned to Scotland in 1757, where he established his own instrument-making business. Watt soon developed a reputation as a high quality engineer and was employed on the Forth & Clyde Canal and the Caledonian Canal. He was also engaged in the improvement of harbours and in the deepening of the Forth, Clyde and other rivers in Scotland.

In 1763 Watt was sent a Newcomen steam engine to repair. While putting it back into working order, Watt discovered how he could make the engine more efficient. Watt worked on the idea for several months and eventually produced a steam engine that cooled the used steam in a condenser separate from the main cylinder. James Watt was not a wealthy man, so he decided to seek a partner with money and asked John Roebuck to provide financial backing for the project. Roebuck agreed and the two men went into partnership. Roebuck held two-thirds of the original patent (9th January 1769) in return for discharging some of Watt's debts.

In March 1773 Roebuck became bankrupt. At the time he owed Matthew Boulton over £1,200. Boulton knew about Watt's research and wrote to him making an offer for Roebuck's share in the steam-engine. Roebuck refused, but on 17th May he changed his mind and accepted Boulton's terms. James Watt was also owed money by Roebuck, but as he had done a deal with his friend, he wrote a formal discharge "because I think the thousand pounds he (Boulton) has paid more than the value of the property of the two thirds of the inventions."

For the next eleven years Boulton's factory produced and sold Watt's steam-engines. These machines were mainly sold to colliery owners who used them to pump water from their mines. Watt's machine was very popular because it was four times more powerful than those that had been based on the Thomas Newcomen design.

Watt continued to experiment and in 1781 he produced a rotary-motion steam engine. Whereas his earlier machine, with its up-and-down pumping action, was ideal for draining mines, this new steam engine could be used to drive many different types of machinery. Richard Arkwright was quick to see the importance of this new invention, and in 1783 he began using Watt's steam-engine in his textile factories. Others followed his lead and by 1800 there were over 500 of Watt's machines in Britain's mines and factories.

Eric Hobsbawm, the author of The Age of Revolution (1962), has argued: "Fortunately few intellectual refinements were necessary to make the Industrial Revolution. Its technical inventions were exceedingly modest, and in no way beyond the scope of intelligent artisans experimenting in their workshops, or of the constructive capacities of carpenters, millwrights, and locksmiths: the flying shuttle, the spinning jenny, the mule. Even its scientifically most sophisticated machine, James Watt's rotary steam-engine (1784), required no more physics than had been available for the best part of a century." Arthur Young pointed out in his book, From Birmingham to Suffolk (1791): "What trains of thought, what a spirit of exertion, what a mass and power of effort have sprung in every path of life, from the works of such men as Brindley, Watt, Priestley, Harrison, Arkwright.... In what path of life can a man be found that will not animate his pursuit from seeing the steam-engine of Watt?"

Watt became a member of the Lunar Society of Birmingham. The group took this name because they used to meet to dine and converse on the night of the full moon. Other members included Matthew Boulton, Josiah Wedgwood, Joseph Priestley, Thomas Day, William Small, John Whitehurst, William Withering, Richard Lovell Edgeworth and Erasmus Darwin. The historian Jenny Uglow has argued: "The members of the Lunar Society were brilliant representatives of the informal scientific web that cut across class, blending the inherited skills of craftsmen with the theoretical advances of scholars, a key factor in British manufacturing's leap ahead of the rest of Europe. Most had been entranced by mechanics in childhood in the 1730s and 1740s, when itinerant lecturers toured the country displaying electrical and mechanical marvels."

In 1775 Watt had been granted a patent by Parliament that prevented anybody else from making a steam-engine like the one he had developed. For the next twenty-five years, the Boulton & Watt company had a virtual monopoly over the production of steam-engines. Watt charged his customers a premium for using his steam engines. To justify this he compared his machine to a horse. Watt calculated that a horse exerted a pull of 180 lb.; therefore, when he made a machine, he described its power in relation to a horse, i.e. "a 20 horse-power engine". Watt worked out how much each company saved by using his machine rather than a team of horses. The company then had to pay him one third of this figure every year, for the next twenty-five years.

James Watt died at Heathfield in Handsworth, Birmingham, on 25th August 1819 and was buried beside Matthew Boulton in St Mary's Church on 2nd September.

Biography of Albert Einstein

Albert Einstein was born at Ulm, in Württemberg, Germany, on March 14, 1879. Six weeks later the family moved to Munich, where he later on began his schooling at the Luitpold Gymnasium. Later, they moved to Italy and Albert continued his education at Aarau, Switzerland, and in 1896 he entered the Swiss Federal Polytechnic School in Zurich to be trained as a teacher in physics and mathematics. In 1901, the year he gained his diploma, he acquired Swiss citizenship and, as he was unable to find a teaching post, he accepted a position as technical assistant in the Swiss Patent Office. In 1905 he obtained his doctor's degree.

During his stay at the Patent Office, and in his spare time, he produced much of his remarkable work and in 1908 he was appointed Privatdozent in Berne. In 1909 he became Professor Extraordinary at Zurich, in 1911 Professor of Theoretical Physics at Prague, returning to Zurich in the following year to fill a similar post. In 1914 he was appointed Director of the Kaiser Wilhelm Physical Institute and Professor in the University of Berlin. He became a German citizen in 1914 and remained in Berlin until 1933, when he renounced his citizenship for political reasons and emigrated to America to take the position of Professor of Theoretical Physics at Princeton. He became a United States citizen in 1940 and retired from his post in 1945.

After World War II, Einstein was a leading figure in the World Government Movement; he was offered the Presidency of the State of Israel, which he declined, and he collaborated with Dr. Chaim Weizmann in establishing the Hebrew University of Jerusalem.

Einstein always appeared to have a clear view of the problems of physics and the determination to solve them. He had a strategy of his own and was able to visualize the main stages on the way to his goal. He regarded his major achievements as mere stepping-stones for the next advance.

At the start of his scientific work, Einstein realized the inadequacies of Newtonian mechanics and his special theory of relativity stemmed from an attempt to reconcile the laws of mechanics with the laws of the electromagnetic field. He dealt with classical problems of statistical mechanics and problems in which they were merged with quantum theory: this led to an explanation of the Brownian movement of molecules. He investigated the thermal properties of light with a low radiation density and his observations laid the foundation of the photon theory of light.

In his early days in Berlin, Einstein postulated that the correct interpretation of the special theory of relativity must also furnish a theory of gravitation and in 1916 he published his paper on the general theory of relativity. During this time he also contributed to the problems of the theory of radiation and statistical mechanics. In the 1920s, Einstein embarked on the construction of unified field theories, although he continued to work on the probabilistic interpretation of quantum theory, and he persevered with this work in America. He contributed to statistical mechanics by his development of the quantum theory of a monatomic gas and he also accomplished valuable work in connection with atomic transition probabilities and relativistic cosmology. After his retirement he continued to work towards the unification of the basic concepts of physics, taking the opposite approach, geometrisation, to the majority of physicists.

Einstein's researches are, of course, well chronicled and his more important works include Special Theory of Relativity (1905), Relativity (English translations, 1920 and 1950), General Theory of Relativity (1916), Investigations on Theory of Brownian Movement (1926), and The Evolution of Physics (1938). Among his non-scientific works, About Zionism (1930), Why War? (1933), My Philosophy (1934), and Out of My Later Years (1950) are perhaps the most important.

Albert Einstein received honorary doctorate degrees in science, medicine and philosophy from many European and American universities. During the 1920s he lectured in Europe, America and the Far East, and he was awarded Fellowships or Memberships of all the leading scientific academies throughout the world. He gained numerous awards in recognition of his work, including the Copley Medal of the Royal Society of London in 1925 and the Franklin Medal of the Franklin Institute in 1935.

Einstein's gifts inevitably resulted in his dwelling much in intellectual solitude and, for relaxation, music played an important part in his life. He married Mileva Maric in 1903 and they had a daughter and two sons; their marriage was dissolved in 1919 and in the same year he married his cousin, Elsa Löwenthal, who died in 1936. He died on April 18, 1955 at Princeton, New Jersey.

From Nobel Lectures, Physics 1901-1921, Elsevier Publishing Company, Amsterdam, 1967. This autobiography/biography was written at the time of the award and first published in the book series Les Prix Nobel. It was later edited and republished in Nobel Lectures. To cite this document, always state the source as shown above.

Abraham Lincoln

Lincoln warned the South in his Inaugural Address: "In your hands, my dissatisfied fellow countrymen, and not in mine, is the momentous issue of civil war. The government will not assail you.... You have no oath registered in Heaven to destroy the government, while I shall have the most solemn one to preserve, protect and defend it." Lincoln thought secession illegal, and was willing to use force to defend Federal law and the Union. When Confederate batteries fired on Fort Sumter and forced its surrender, he called on the states for 75,000 volunteers. Four more slave states joined the Confederacy but four remained within the Union. The Civil War had begun.

The son of a Kentucky frontiersman, Lincoln had to struggle for a living and for learning. Five months before receiving his party's nomination for President, he sketched his life: "I was born Feb. 12, 1809, in Hardin County, Kentucky. My parents were both born in Virginia, of undistinguished families--second families, perhaps I should say. My mother, who died in my tenth year, was of a family of the name of Hanks.... My father ... removed from Kentucky to ... Indiana, in my eighth year.... It was a wild region, with many bears and other wild animals still in the woods. There I grew up.... Of course when I came of age I did not know much. Still somehow, I could read, write, and cipher ... but that was all."

Lincoln made extraordinary efforts to attain knowledge while working on a farm, splitting rails for fences, and keeping store at New Salem, Illinois. He was a captain in the Black Hawk War, spent eight years in the Illinois legislature, and rode the circuit of courts for many years. His law partner said of him, "His ambition was a little engine that knew no rest." He married Mary Todd, and they had four boys, only one of whom lived to maturity.

In 1858 Lincoln ran against Stephen A. Douglas for Senator. He lost the election, but in debating with Douglas he gained a national reputation that won him the Republican nomination for President in 1860.

As President, he built the Republican Party into a strong national organization. Further, he rallied most of the northern Democrats to the Union cause. On January 1, 1863, he issued the Emancipation Proclamation that declared forever free those slaves within the Confederacy. Lincoln never let the world forget that the Civil War involved an even larger issue. This he stated most movingly in dedicating the military cemetery at Gettysburg: "that we here highly resolve that these dead shall not have died in vain--that this nation, under God, shall have a new birth of freedom--and that government of the people, by the people, for the people, shall not perish from the earth."

Lincoln won re-election in 1864, as Union military triumphs heralded an end to the war. In his planning for peace, the President was flexible and generous, encouraging Southerners to lay down their arms and join speedily in reunion. The spirit that guided him was clearly that of his Second Inaugural Address, now inscribed on one wall of the Lincoln Memorial in Washington, D.C.: "With malice toward none; with charity for all; with firmness in the right, as God gives us to see the right, let us strive on to finish the work we are in; to bind up the nation's wounds...."

On Good Friday, April 14, 1865, Lincoln was assassinated at Ford's Theatre in Washington by John Wilkes Booth, an actor, who somehow thought he was helping the South. The opposite was the result, for with Lincoln's death, the possibility of peace with magnanimity died.

The Presidential biographies on WhiteHouse.gov are from “The Presidents of the United States of America,” by Michael Beschloss and Hugh Sidey. Copyright 2009 by the White House Historical Association.

Alexander the Great (356 - 323 BC)

Alexander III of Macedon, better known as Alexander the Great, single-handedly changed the nature of the ancient world in little more than a decade.

Alexander was born in Pella, the ancient capital of Macedonia in July 356 BC. His parents were Philip II of Macedon and his wife Olympias. Alexander was educated by the philosopher Aristotle. Philip was assassinated in 336 BC and Alexander inherited a powerful yet volatile kingdom. He quickly dealt with his enemies at home and reasserted Macedonian power within Greece. He then set out to conquer the massive Persian Empire.

Against overwhelming odds, he led his army to victories across the Persian territories of Asia Minor, Syria and Egypt without suffering a single defeat. His greatest victory was at the Battle of Gaugamela, in what is now northern Iraq, in 331 BC. The young king of Macedonia, leader of the Greeks, overlord of Asia Minor and pharaoh of Egypt became 'great king' of Persia at the age of 25.

Over the next eight years, in his capacity as king, commander, politician, scholar and explorer, Alexander led his army a further 11,000 miles, founding over 70 cities and creating an empire that stretched across three continents and covered around two million square miles. The entire area from Greece in the west, north to the Danube, south into Egypt and as far to the east as the Indian Punjab, was linked together in a vast international network of trade and commerce. This was united by a common Greek language and culture, while the king himself adopted foreign customs in order to rule his millions of ethnically diverse subjects.

Alexander was acknowledged as a military genius who always led by example, although his belief in his own indestructibility meant he was often reckless with his own life and those of his soldiers. The fact that his army refused to follow him only once in 13 years of a reign during which there was constant fighting indicates the loyalty he inspired.

He died of a fever in Babylon in June 323 BC.

20130212

Flip-Flops

I search and search, who knows where you've gone
every day I wear you
yet you always leave me
I thought you were faithful to me
but it turns out other feet have touched you

now I wear you once again
an hour later you are gone once more

I wish I didn't have to wear you at all
so you would never stray far from me
but what can I do, these feet need you so
in the end I'm forced to turn away from you
in search of a true flip-flop's love....

                                                                                created by: fikri dzikrillah