1. Online Business:
Learn how to sell the best-selling products online with a simple Internet storefront. NO INVENTORY! All products are drop-shipped directly to your customer's doorstep with your business name on the box. You'll learn:
- How to create and register your own unique business name
- How to form your own Sole Proprietorship or Corporation
- How to easily obtain your reseller's license (sales tax license)
- How to get signed up with the top wholesale distributors in the U.S.
- How to sell over 250,000 BRAND NEW top-selling products online
- How to choose which product niche to specialize in
- How to create and register your own optimized domain name
- How to choose a reliable web hosting provider and hosting plan
- How to easily build your own online store or have it built for you
- How to accept credit cards online with your own merchant account
- How to promote, market, and advertise your online store, and more
Here you'll discover how to achieve financial success by selling the best-selling products online. You'll finally understand why it's always best to build your own online store instead of joining an affiliate program or purchasing online store builder programs.
EARN A SUBSTANTIAL INCOME ONLINE
"The average Internet storefront owner earns approximately $60,000 yearly from home, while dedicated entrepreneurs who consistently market their storefronts earn anywhere from $70,000 to over $100,000 in their first year in business." Six-figure incomes are not uncommon.
CONVENIENCE
Becoming an internet storefront owner allows you to conduct all of your daily business transactions via the Internet, with the added benefit of working from the privacy of your own home, which makes this the perfect home-based business.
FLEXIBILITY
You can start off part-time or full-time and work as many hours as you see fit. However, be advised that the main purpose of this home business plan is to help you make the full transition from your present full-time job to your new online store internet business.
QUICK START-UP
You can start, build, and set up your new internet storefront business in less than 2 weeks. You'll learn how to create, design, and promote a profitable Internet storefront that takes orders 24/7. Let your online store do all of the selling for you. With an online store you'll earn money while you sleep.
2. Online Jobs:
Dear Internet Friend, Welcome.
This is exclusively designed for Indians who want to earn money through home-based internet jobs without any investment and who can spend only a few hours a day. Trust us, you will earn Rs. 50,000 and more from this month.
What is the actual work?
The actual job is filling in online data entry forms on the internet. We will provide you with simple Online Registration Forms. You have to fill in those Online Registration Forms according to the instructions.
Is it easy to fill in those registration forms?
Yes. These are very simple data entry forms, similar to creating an email account on Yahoo, Hotmail, or Rediff. It takes less than 2 minutes to fill in each form. This is similar to data entry jobs.
How much can I earn for filling in each form?
You can earn between Rs. 50 and Rs. 100 for each data entry form you fill in. We have many categories of forms to fill, and the rate varies by category. On average, you can earn Rs. 75 for each form you fill in.
Is this work available worldwide?
No. This work is available only to Indians. In the future, we will recruit people from all over the world.
What qualifications do I need to do this work?
No extra qualifications are needed. Just basic internet browsing knowledge is enough to do this job.
How much time do I need to work on the Internet?
There are no hard and fast rules regarding the work. Your earnings depend on the number of hours you work, and you can work at your convenient timings. But it is good to work at least 1 hour daily to earn a decent income.
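As a rough sanity check on the advertised numbers, the implied earnings from one hour of daily work can be sketched. The rates (Rs. 50 to Rs. 100 per form, Rs. 75 on average, under 2 minutes per form) come from the text above; the forms-per-hour pace is an assumption derived from them:

```python
# Rough earnings estimate from the advertised figures.
# The per-form rate and 2-minute bound are stated in the ad;
# the steady 30-forms-per-hour pace is an assumption.
avg_rate_rs = 75                          # average payment per form (stated)
minutes_per_form = 2                      # upper bound stated in the ad
forms_per_hour = 60 // minutes_per_form   # 30 forms in one hour

daily_earnings = forms_per_hour * avg_rate_rs    # earnings from 1 hour daily
monthly_earnings = daily_earnings * 30           # over a 30-day month

print(daily_earnings, monthly_earnings)  # → 2250 67500
```

At the stated rates, one hour a day would indeed exceed the Rs. 50,000-per-month claim, which is exactly the kind of arithmetic worth checking before signing up with any such site.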
When will I get the payment and what is the mode of payment?
You will be paid on or around the 20th of every month for the previous month's earnings. You will be paid only if your earnings are equal to or more than Rs. 2,000. If they are less than Rs. 2,000 for a particular month, the amount will be added to the next month's earnings. We pay you by check.
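The payment rule described above (a Rs. 2,000 minimum, with shortfalls carried forward to the next month) can be sketched as a small function. The threshold and carry-over behaviour are taken from the text; the function itself is purely illustrative:

```python
# Illustrative model of the stated payout rule: a check is issued
# around the 20th only when accumulated earnings reach Rs. 2,000;
# otherwise the balance carries over to the next month.
PAYOUT_THRESHOLD_RS = 2000

def monthly_payouts(earnings_by_month):
    """Return the check amount issued each month (0 if carried over)."""
    balance = 0
    payouts = []
    for earned in earnings_by_month:
        balance += earned
        if balance >= PAYOUT_THRESHOLD_RS:
            payouts.append(balance)   # full accumulated balance is paid out
            balance = 0
        else:
            payouts.append(0)         # below threshold: carry forward
    return payouts

print(monthly_payouts([1500, 800, 3000]))  # → [0, 2300, 3000]
```

Note how the Rs. 1,500 earned in the first month is not lost: it is paid out with the second month's Rs. 800 once the combined balance crosses the threshold.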
3. Online Tuition jobs:
Aim-for-A Tutoring is looking for part-time and full-time tutors to provide online tutoring services to students worldwide. If you have mastery of ANY subject, we would like to hear from you! You may be located anywhere in the world; if you have a computer with a broadband Internet connection, you can now be part of the revolution in teaching & learning.
Method of Tutoring
The tutoring will be provided online using audio and data conferencing. Prospective tutors must have access to the following:
- A personal computer with broadband Internet access
- Computer speakers and a microphone
- Internet telephone software (Skype)
- For some subjects, a digital tablet & pen mouse is highly recommended (details will be provided to you)
The tutoring will also use shared whiteboard software that will be made available for free. Instructions for obtaining and installing the required software will be provided prior to the tutoring session.
Tutoring Subjects & Goals
- Math, Science, English Language Arts, or any other subject
- Elementary, Middle, High School, and College level, as well as general-interest topics
- Homework help
- Preparation for standardized tests such as GED, PSAT, SAT, GRE, GMAT
Required Qualifications for Tutors
- A minimum of a bachelor's degree in the subjects you would like to teach
- Excellent verbal and written communication skills in the English language
- Some teaching experience
4. Online Computer Jobs:
Hundreds of advertising websites are ready to pay you for your online work. Advertisers display their products on the advertising companies' websites, and you will be paid for viewing these online displays. Sometimes you will also need to share your honest opinion about their products.
The startling fact is that you need not purchase or sell anything from them. Just sit! Just watch! Just answer! And get paid handsomely! Even a child can do this kind of work. By now you will have realized that making money through online jobs is a cakewalk.
Apply for Online Jobs Now
Start registering with those online job providing sites. There are around a thousand sites providing these types of online money-making opportunities. Registration with them is totally free. But selecting genuine sites to work with will not be easy at this stage, because it requires a lot of payment analysis and regular updating.
Keep in mind that there are several scam sites on the internet where you will never be paid for your hard work. They are a total waste of your time and money. Having done all of that scrutiny for you, we maintain a large database of real online jobs that are genuine in payment and help you earn money at home with no investment.
5. Telemarketing Call Centers
We’ve all heard jokes about telemarketers – from the calls during dinner to the “funny” things to say in response – but working in a call center can actually be a viable way to make a living if you enjoy speaking with people and providing good customer service. Today, many business owners feel that it’s important to try to reach new customers by phone. Therefore, finding a job with a telemarketing call center isn’t all that difficult. Working in the field, you’ll find that you experience many of the same issues as employees in other types of call centers.
In general, when you begin working as a telemarketer, you’ll be provided with a list of names, telephone numbers and a pre-written script to follow. Your objective will be to make direct contact with an individual and then try to sell them a product or service. Unfortunately, most people will tell you they aren’t interested in your product or hang up on you as soon as they figure out you are a telemarketer. In most cases, that happens as soon as you tell them your name and the company you represent – if you don’t deal well with rejection, seek another line of work.
Because telemarketers are often viewed as a nuisance, many consumers have registered their phone numbers with the Do Not Call Registry. If you're a telemarketer, you aren't allowed to call individuals on this list. If you do, and they determine that you're a telemarketer, you may have to pay fines and penalties or be subject to investigation by law enforcement authorities.
Even though you may find a number of positions related to telemarketing, it's not always an easy job. In particular, if you're offered payment based on sales, you may find that it takes a huge number of calls to attain any kind of success. On the other hand, as a telemarketer, you'll gain experience dealing with many different kinds of people and communication behaviors, all while earning a healthy income.
6. BPO Jobs:
One of the most important advantages of BPO is the way in which it helps to increase a company's flexibility. However, different sources perceive organizational flexibility in different ways, and business process outsourcing accordingly enhances an organization's flexibility in several ways.
Most services provided by BPO vendors are offered on a fee-for-service basis. This helps a company become more flexible by transforming fixed costs into variable costs. A variable cost structure helps a company respond to changes in required capacity and does not require it to invest in assets, thereby making the company more flexible. Outsourcing may provide a firm with increased flexibility in its resource management and may reduce response times to major environmental changes.
A third way in which BPO increases organizational flexibility is by increasing the speed of business processes. Using techniques such as linear programming can reduce cycle time and inventory levels, which can increase efficiency and cut costs. Supply chain management with the effective use of supply chain partners and business process outsourcing increases the speed of several business processes, such as the throughput in the case of a manufacturing company.
A company may be able to grow at a faster pace as it will be less constrained by large capital expenditures for people or equipment that may take years to amortize, may become outdated or turn out to be a poor match for the company over time.
Although the above-mentioned arguments favor the view that BPO increases the flexibility of organizations, management needs to be careful with its implementation, as there are a few stumbling blocks that could counter these advantages. Among the problems that arise in practice are: a failure to meet service levels, unclear contractual issues, changing requirements and unforeseen charges, and a dependence on the BPO provider that reduces flexibility. Consequently, these challenges need to be considered before a company decides to engage in business process outsourcing.
A further issue is that in many cases there is little that differentiates the BPO providers other than size. They often provide similar services, have similar geographic footprints, leverage similar technology stacks, and have similar Quality Improvement approaches.
7. Data Entry Jobs:
DataEntryPortal.com is the leading online data entry referral service. Through our network of TRUSTED partners, you can be confident that the opportunities are legitimate. Our goal is to provide our members with the most accurate and up-to-date information about online "work from home" data entry companies and help you avoid the scams. We are not a data entry company and do not solicit data entry workers. You will have access to both free and fee-based opportunities around the world!
Our members save time and money with the information provided in the Members Only Portal. We have helped thousands of online data entry workers find real data entry opportunities and start making good money working from home.
How it Works
Once you become a DataEntryPortal.com member, you will have exclusive access to legitimate data entry opportunities via the Members Only Portal. You will be in control of which companies you want to work with, and they will pay you directly via direct deposit. Earnings are paid weekly and can be sent anywhere in the world. All that is required is a computer and Internet access, and you can start making a significant income.
We update the Members Only Portal regularly with new information and send email notifications to all members with important updates. This will allow you to continue making money for as long as you like.
- All data is provided!
- Most forms are 1-3 pages and take only minutes to complete!
- You do not need any special software or hardware!
- Start earning immediately!
Work whenever and wherever you want. Full time and part time workers are needed worldwide and you can work as much or as little as you like. There are no minimum or maximum restrictions so you can work 1 day per month or every day. The more data you enter, the more money you make!
8. Earn From Home Using Computer:
Working from home with paid surveys is a wonderful way to earn some extra money, but there is some basic information you need to know before you enter this world. The first question on a lot of people's minds is: what are paid surveys? Major companies hire marketing firms to find people like you and me to take their surveys.
Can you find real work at home? Even though our economy is not in a good place and the current state of affairs in the corporate world is not any better, there are ways to earn extra money, and that is to find real work at home. Working at home has become very popular.
Do you have a son or daughter in college who needs to earn some extra money? How about having that child earn money working online, since kids these days are always on the internet with applications like Facebook, Myspace, and other social platforms? Most young college students don't see the huge potential of working online.
Do you dream of your own money making business that's much better than a real job? Are you serious about earning money from home? Do you really want to work on your own terms and control your own life? We're here to help make your dreams a reality! Real people have used this site for years and continue to be floored by how successful they've become.
9. Online Financial Business:
Aplicor Financial Services CRM Software Edition helps some of the largest commercial and retail banks, investment bankers, asset managers and insurance companies manage and automate front to back office processes so that staff are freed to focus on core activities such as increasing customer share, growing assets under management, creating wealth, ensuring regulatory compliance and expanding customer relationships.
The Financial Services CRM software edition delivers a pre-built industry specific on-demand solution to achieve key competitive advantages:
Make use of a 360-degree holistic customer view at every customer touch point to identify renewal processing, complementary solutions, financial product promotions, cross-selling and up-selling opportunities, and other value-added recommendations
Integrate better customer and household information with line of business financial products and customer portfolios in order to succeed in converting more referrals, growing customer share and increasing customer longevity
Leverage proven selling methodologies, systemic sales processes and a searchable library of best practices to increase sales conversion rates, sales person quota achievement and predictable revenue growth
Financial Services CRM provides at-a-glance viewing of the complete customer relationship along with insight into the customer's portfolio, history, household, objectives, risk tolerance and interests. A centralized book of business and customer information enables staff to provide proactive product and policy management, easily create new quotes, automate renewal processes, grow the customer relationship and increase customer share.
Customer knowledge provides the single-greatest path to profitable organic growth in the financial services industry. The Financial Services Edition provides complete front-to-back office client information in a centralized 360 degree customer view. With one integrated system, staff views all client marketing interactions, sales communications, customer support incident history, client contacts and preferences, client documentation, complete purchasing history and outstanding sales and receivables information.
10. Online marketing Jobs:
Job Purpose:
The Online Marketing Specialist will be a member of the E-Commerce Online Marketing team and will work on several aspects of The Body Shop's online marketing programs in both the US and Canada. This individual will be primarily responsible for the daily management of the company's affiliate program, Amazon store, on-site blog, and social networking efforts. They will also be responsible for tactically supporting a variety of the E-Commerce group's customer acquisition efforts, and will work with the E-Commerce management team to successfully drive marketing activity and measure results.
Overall
• Manage all daily aspects of performance analysis and reporting for Affiliates, Amazon, on-site blog, social networking and new partnerships.
• Responsible for budget management, accrual process, and spend allocation
• Report performance on online media acquisition/retention efforts including spend, conversions, number of customers acquired and ROI.
• Track and report on campaign results, perform data analysis, and hold weekly campaign status calls with partners. Develop key learnings to apply to upcoming campaigns.
• Develop marketing strategies, timelines, and action plans specific to each channel.
• Manage our extensive affiliate network, working with Link Share to drive improved performance. Work with the Online Marketing Manager to develop a strategy for refining the program to focus on top affiliates. Develop and execute campaigns targeting affiliates (trial offers, web site specials) to meet monthly sales goals.
• Understand the relationship between affiliate marketing and SEO and PPC activities. Stay up-to-date on current trends and developments in affiliate marketing and the tactics of competitors.
Skills:
• Thorough knowledge and understanding of key metrics and tracking for each assigned channel, and ability to define and maintain critical reports.
• Experience using Amazon to sell products within a business environment.
• Experience with major social media outlets, niche social media environments, popular message boards/forums, social bookmarking outlets, video portals, knowledge place sites, etc.
• Outstanding verbal and written communication skills.
• Exceptional attention to detail and ability to effectively multi-task in a deadline-driven atmosphere.
• Familiarity with e-commerce platforms, email marketing and standard online marketing tools, including web analytics tools (Coremetrics experience a plus).
• Motivated personality to "own" a channel/segment and grow revenues.
• Superior analytical skills and ability to drive results.
• Great relationship-builder and adept at working with a team.
• Creative free thinker who loves being social online.
11. Online sales Assistant:
We are looking for a bright star who has experience in online marketing and who can use their skills to contribute to the growth in online sales. This is a challenging and rewarding role for someone who is commercial, analytical and personable and who wants the opportunity to gain experience in an established, fast growing business.
Marketing:
You will work with the Marketing Manager and be involved in SEO activity, PPC campaigns, viral marketing, e-newsletters, and tactical PR and news management, as well as the development and improvement of our site layout and landing page content. You will also get involved in branding and events, CRM, and the communications strategy.
Sales:
You will work with the B2B Account Manager and be involved in account setup, processing customer orders, order fulfillment and customer service.
Reporting:
You will be required to report on, analyze and forecast future online sales.
Essential Skills/Experience:
You will need a minimum of 1 year's B2B experience in an internet marketing focused role. You will have a working understanding of most of the following: SEO, PPC, viral marketing, e-newsletters, tactical PR, CRM, email marketing, etc.
Desirable Skills/Experience:
An understanding of how to use Dreamweaver and Photoshop
Able to work on own initiative and able to prioritize workload.
Able to work as part of a team with strong interpersonal skills.
12. Earn From Your Inbox Mails:
If you are trying to find ways to make some more money to top up your wage, or even to save for a special purchase, but don't have the time to take on another job, why not consider filling in surveys? Today companies need to know the feelings of the general public when they are looking at new advertising, new products and much more, and as a result they pay other companies to conduct surveys on their behalf to find out what people think about certain subjects.
This kind of research is much cheaper than having people stand in the town centre asking passers-by questions, which is not very cost-effective and can take some time to get enough responses. So how can you get yourself on the lists for these surveys and start to make money from them?
If you want to get paid surveys straight to your inbox and start to earn money, it couldn't be easier. First you need to join as many survey sites as possible and wait for the surveys to start flooding into your inbox. Go online, type "paid surveys" into your search engine, and work through the list of results you are presented with. You can use different combinations of words to get more survey sites in your results; you can even type in "get paid surveys" and expect a good selection.
When you join a survey site you will be able to find out whether it offers financial compensation for completing surveys or a points scheme; some sites do the latter, and those points can be exchanged for goods, services, or sometimes money tokens. If you are only looking to earn cold hard cash, do not bother signing up for sites that do not offer money as compensation; after all, you will only be wasting your time.
As part of the joining process for survey sites you will be asked to fill in a registration form, which will ask you certain questions such as your name, age, address, and telephone number, plus certain things about your household (how many people live with you, whether you have any children, etc.). These questions are asked so that the survey company can build a profile of you and only send you paid surveys that are relevant to you. For instance, if a company is looking for male smokers over 40 years of age, it is pointless sending a survey to a 21-year-old female non-smoker.
Once you have joined survey sites you can sit back and wait for paid surveys to arrive straight in your inbox. Obviously, the more sites you join, the more surveys you will be asked to complete. Check your inbox every day, as you will get paid surveys regularly and most surveys have a completion time limit, so don't miss out. Before you know it you will be completing surveys every day.
13. Online Banking Jobs:
The economy of India is the twelfth largest in the world by market exchange rates and the fourth largest in the world by GDP measured on purchasing power parity (PPP) basis.
The country was under socialist-based policies for an entire generation from the 1950s until the 1980s. The economy was characterized by extensive regulation, protectionism, and public ownership, leading to pervasive corruption and slow growth. Since 1991, continuing economic liberalization has moved the economy towards a market-based system. By 2009, India had prominently established itself as the world's second-fastest growing major economy.
Agriculture is the predominant occupation in India, accounting for about 60% of employment. The service sector makes up a further 28% and industrial sector around 12%. One estimate says that only one in five job-seekers have had any sort of vocational training. The labor force totals half a billion workers. For output, the agricultural sector accounts for 17% of GDP; the service and industrial sectors make up 54% and 29% respectively. Major agricultural products include rice, wheat, oilseed, cotton, jute, tea, sugarcane, potatoes, cattle, water buffalo, sheep, goats, poultry and fish. Major industries include textiles, chemicals, food processing, steel, transportation equipment, cement, mining, petroleum, machinery and software design.
In 2007, India's GDP was $1.237 trillion, which makes it the twelfth-largest economy in the world, or fourth largest by purchasing power adjusted exchange rates. India's nominal per capita income of $1,043 is ranked 136th in the world. In the late 2000s, India's growth averaged 7.5% a year, an increase that would double the average income within a decade. The unemployment rate is 7% (2008 estimate). Previously a closed economy, India's trade has grown fast. India accounted for 1.5% of world trade as of 2007 according to the WTO. According to the World Trade Statistics of the WTO, India's total merchandise trade (counting exports and imports) was valued at $294 billion in 2006, and India's services trade, inclusive of exports and imports, was $143 billion. Thus, India's global economic engagement in 2006, covering both merchandise and services trade, was of the order of $437 billion, up by a record 72% from a level of $253 billion in 2004. India's trade has reached a still relatively moderate share of 24% of GDP in 2006, up from 6% in 1985.
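The trade figures quoted above are internally consistent, which is easy to verify. All numbers below come from the text itself:

```python
# Cross-checking the WTO trade figures quoted in the text (USD billions).
merchandise_2006 = 294   # merchandise trade, exports plus imports
services_2006 = 143      # services trade, exports plus imports
total_2004 = 253         # combined engagement in 2004

total_2006 = merchandise_2006 + services_2006      # 437, as stated
growth = (total_2006 - total_2004) / total_2004    # ~0.727, the "record 72%"

print(total_2006, int(growth * 100))  # → 437 72
```

The combined 2006 figure matches the stated $437 billion, and the growth from 2004 works out to roughly 72.7%, consistent with the "record 72%" in the text.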
Despite robust economic growth, India continues to face several major problems. Recent economic development has widened economic inequality across the country. Despite a sustained high economic growth rate, approximately 80% of its population lives on less than $2 a day (PPP), more than double the same poverty rate in China. Even though the arrival of the Green Revolution brought an end to famines in India, 40% of children under the age of three are underweight and a third of all men and women suffer from economic crisis.
14. KPO Online Jobs:
Knowledge process outsourcing (KPO) is a form of outsourcing, in which knowledge-related and information-related work is carried out by workers in a different company or by a subsidiary of the same organization, which may be in the same country or in an offshore location to save cost. Unlike the outsourcing of manufacturing, this typically involves high-value work carried out by highly skilled staff. KPO firms, in addition to providing expertise in the processes themselves, often make many low level business decisions—typically those that are easily undone if they conflict with higher-level business plans.
services. So do more technical trends such as service oriented architecture, enterprise application integration and telework: it is easier to outsource a job if it is already being performed outside the head office. Organizations adopting ISO 9000 and ISO 19011 should also find it much easier to integrate externally provided KPO into their operations and audit them on a fair basis.
As of 2007, most US organizations were hiring foreign professionals under H-1B visas to do jobs in the USA for several years, after which they would return to their home countries as managers to train and supervise others, continuing to report to their former business units. The following extract from chapter two of the British Computer Society book 'Global Services: Moving to a Level Playing Field' by Mark Kobayashi-Hillary and Dr Richard Sykes attempts to define KPO:
"KPO is merely a continuation of BPO, though with rather more business complexity. The defining difference is that KPO is usually focused on knowledge-intensive business processes that require significant domain expertise. The offshore team servicing a KPO contract cannot be easily hired overnight as they will be highly educated and trained, and trusted to take decisions on behalf of the client.
IT outsourcing is strongly focused around technical professionalism, and the migration to business process outsourcing introduces this extra dimension of application professionalism. Ever more complex services, as implied by KPO, demonstrate this very well. The profile of people being hired to serve within KPO service companies are more diverse than just being drawn from technical IT services – these are people with MBAs, and medical, engineering, design or other specialist business skills. KPO delivers higher value to organizations that offshore their domain-based processes, thereby enhancing the traditional cost–quality paradigm of BPO. The central theme of KPO is to create value for the client by providing business expertise rather than process expertise. So KPO involves a shift from standardized processes to advanced analytical thinking, technical skills and decisive judgment based on experience".
15. Business Opportunity:
A business opportunity, or bizopp, involves the sale or lease of any product, service, equipment, etc. that will enable the purchaser-licensee to begin a business. The licensor or seller of a business opportunity usually declares that it will secure or assist the buyer in finding a suitable location or provide the product to the purchaser-licensee. This is different from the sale of an independent business, in which there is no continued relationship required by the seller.
A common type of business opportunity involves a company that sells bulk vending machines and promises to secure suitable locations for the machines. The purchaser is counting on the company to find locations where sales will be high enough to enable him to recoup his expenses and make a profit. Because of the many cases of fraudulent biz-ops in which companies have not followed through on their promises, or in which profits were much less than what the company led the investor to believe, governments closely regulate these operations.
Multi-level marketing is often presented as a business opportunity, as in the phrase, "Let me tell you about an incredible ground-level business opportunity."
In the United States, the Federal Trade Commission receives complaints and helps coordinate enforcement action against fraudulent business opportunities.
A business opportunity consists of four integrated elements, all of which must be present within the same timeframe (the window of opportunity) and most often within the same domain or geographical location, before it can be claimed as a business opportunity. These four elements are:
With any one of the elements missing, a business opportunity may still be developed by finding the missing element. The more unique the combination of the elements, the more unique the business opportunity. The more control an institution (or individual) has over the elements, the better positioned it is to exploit the opportunity and become a niche market leader.
16. Earn By Online Survey:
People can earn money as survey takers online; the problem is that it often isn't very much. Surveys are individually fairly low-paying, so you'd have to take an awful lot of them, and there aren't always surveys available to take. It is this inconsistency that many people report as the reason they failed to make a work-from-home job out of survey taking. However, for experienced business people and those going into business for the first time, an Internet Consultancy franchise with WSI is the ideal option for home-based work, with the benefit of having a professional team of experts behind the businessperson.
It’s not necessary to have any previous technical experience. WSI has become a leading worldwide franchising opportunity by developing an extensive training program that will The reputation of WSI speaks for itself. Entrepreneur has consistently ranked WSI first in the worldwide internet franchising category. With 1500 franchisees operating around the world, we’ve grown and developed a unique business strategy and methods of operation that are designed to help you quickly start a profitable, sustainable business.
WSI has evolved a series of Business Solutions that are developed in one of our 12 production centers located in key, low-cost, yet technologically developed areas around the world. Your role will be to determine which of our solutions is the best fit for your client by conducting an internet business analysis. In this manner you'll be able to help businesses reduce costs and become more profitable. By working with small and medium-sized companies in your area you'll be able to see firsthand the positive results of your successful franchise!
17. Economic Development:
"'Economic development' or 'development' is a term that economists, politicians, and others have used frequently in the 20th century. The concept, however, has been in existence in the West for centuries. Modernization, Westernization, and especially Industrialization are other terms people have used when discussing economic development. Although no one is sure when the concept originated, most people agree that development is closely bound up with the evolution of capitalism and the demise of feudalism."
Among other things, the contemporary social scientific study of economic development encompasses broad theories of the causes of industrial-economic modernization plus organizational and related aspects of enterprise development in modern societies. It embraces sociological-type research relating to business organization and enterprise development from a historical and comparative perspective; specific processes of the evolution (growth, modernization) of markets and management-employee relations; and culturally related cross-national similarities and differences in patterns of industrial organization in contemporary Western societies. On the subject of the nature and causes of the considerable variations that exist in levels of industrial-economic growth and performance internationally, it seeks answers to such questions as: "Why are levels of direct foreign investment and labor productivity significantly higher in some countries than in others?"
Economic Growth vs. Economic Development
Economic development refers to social and technological progress. Economic growth is often assumed to indicate the level of economic development. The term "economic growth" refers to the increase (or growth) of a specific measure such as real national income, gross domestic product, or per capita income. National income or product is commonly expressed in terms of a measure of the aggregate value-added output of the domestic economy called gross domestic product (GDP). When the GDP of a nation rises, economists refer to it as economic growth.
The term economic development, on the other hand, implies much more. It typically refers to improvements in a variety of indicators such as literacy rates, life expectancy, and poverty rates. GDP is a specific measure of economic welfare that does not take into account important aspects such as leisure time, environmental quality, freedom, or social justice. Economic growth of any specific measure is not a sufficient definition of economic development.
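As a toy illustration of the distinction drawn above, the sketch below computes a growth figure from GDP alone and sets it against a development indicator; every number here is invented for the example.

```python
# Illustrative sketch: GDP growth alone does not capture development.
# All figures below are made up for demonstration.

def growth_rate(start, end):
    """Percentage growth of a measure such as real GDP."""
    return (end - start) / start * 100

# Hypothetical country: GDP rises strongly, but a development
# indicator (literacy) barely moves over the same decade.
gdp_2000, gdp_2010 = 500.0, 750.0          # billions, invented
literacy_2000, literacy_2010 = 62.0, 63.0  # percent, invented

print(f"Economic growth: {growth_rate(gdp_2000, gdp_2010):.1f}% over the decade")
print(f"Literacy change: {literacy_2010 - literacy_2000:.1f} percentage points")
```

The point of the contrast is exactly the one made in the text: the first number alone says nothing about the second.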
17. Financial Crisis:
Banking crises
When a bank suffers a sudden rush of withdrawals by depositors, this is called a bank run. Since banks lend out most of the cash they receive in deposits (see fractional-reserve banking), it is difficult for them to quickly pay back all deposits if these are suddenly demanded, so a run may leave the bank in bankruptcy, causing many depositors to lose their savings unless they are covered by deposit insurance. A situation in which bank runs are widespread is called a systemic banking crisis or just a banking panic. A situation without widespread bank runs, but in which banks are reluctant to lend, because they worry that they have insufficient funds available, is often called a credit crunch. In this way, the banks become an accelerator of a financial crisis.
Examples of bank runs include the run on the Bank of the United States in 1931 and the run on Northern Rock in 2007. The collapse of Bear Stearns in 2008 has also sometimes been called a bank run, even though Bear Stearns was an investment bank rather than a commercial bank. The U.S. savings and loan crisis of the 1980s led to a credit crunch which is seen as a major factor in the U.S. recession of 1990-91.
Speculative bubbles and crashes
Economists say that a financial asset (stock, for example) exhibits a bubble when its price exceeds the present value of the future income (such as interest or dividends) that would be received by owning it to maturity. If most market participants buy the asset primarily in hopes of selling it later at a higher price, instead of buying it for the income it will generate, this could be evidence that a bubble is present. If there is a bubble, there is also a risk of a crash in asset prices: market participants will go on buying only as long as they expect others to buy, and when many decide to sell the price will fall. However, it is difficult to tell in practice whether an asset's price actually equals its fundamental value, so it is hard to detect bubbles reliably. Some economists insist that bubbles never or almost never occur.
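The present-value comparison described above can be sketched numerically. The dividend stream, discount rate, and quoted price below are all hypothetical; the point is only to show what "price exceeds the present value of future income" means in practice.

```python
# Sketch of the bubble test described above: compare an asset's market
# price with the discounted value of the income from holding it to
# maturity. All cash flows and rates here are invented.

def present_value(cash_flows, rate):
    """Discount a list of future cash flows back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

dividends = [5.0] * 10 + [100.0]  # ten $5 dividends, then $100 at maturity
fundamental = present_value(dividends, rate=0.05)
market_price = 150.0              # hypothetical quoted price

print(f"Fundamental value: {fundamental:.2f}")
print(f"Premium over fundamentals: {market_price - fundamental:.2f}")
```

As the text notes, the hard part in reality is not this arithmetic but knowing the true future cash flows and discount rate, which is why bubbles are difficult to detect reliably.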
Well-known examples of bubbles (or purported bubbles) and crashes in stock prices and other asset prices include the Dutch tulip mania, the Wall Street Crash of 1929, the Japanese property bubble of the 1980s, the crash of the dot-com bubble in 2000-2001, and the now-deflating United States housing bubble.
18. Mobile Development:
This page lists the known relative differences between the most popular mobile platform development options for handheld devices such as personal digital assistants, enterprise digital assistants or mobile phones. It is not intended to be an absolute guide to the various mobile development platforms; instead it is to help guide developers in choosing a mobile platform for development on Information appliances. More detail on the subject can be found at the Mobile software article.
JAVA ME:
Ideal for a portable solution, if the Java ME platform provides the needed functionality. Good for vertical applications that must be portable. Device-specific libraries exist for many devices and are commonly used for games, making them non-portable. Applications (including their data) cannot be larger than around 1 MB if they are to run on most phones. They must also be cryptographically signed in order to effectively use many APIs such as the file system access API. This is relatively expensive and is rarely done, even for commercial applications.
Symbian:
Very powerful for general purpose development. The Symbian based S60 platform is strongly supported by Nokia with some support from other device manufacturers. In Japan NTT DoCoMo's Symbian based MOAP platform is also well supported by a number of manufacturers (Fujitsu, Sony Ericsson Japan, Mitsubishi and Sharp amongst others). Note, however, that MOAP is not an open development platform. Another Symbian based platform, UIQ, is less well supported (principally by Sony Ericsson and Motorola). Symbian currently has large device deployments in Europe and Japan, with little penetration in the US market.
iPhone:
The iPhone and iPod Touch SDK uses Objective-C, based on the C programming language. Currently, the SDK is only available on Mac OS X 10.5 and is the only way to write an iPhone application. All applications must be cleared by Apple before being hosted on the App Store, the sole distribution channel for iPhone and iPod Touch applications. However, non-Apple-approved applications can be released for jailbroken iPhones via Cydia or Installer.
19. Mobile Software:
Mobile software is designed to run on handheld computers, personal digital assistants (PDAs), enterprise digital assistants (EDAs), smart phones and cell phones. Since the first handheld computers of the 1980s, the popularity of these platforms has risen considerably. Recent model cell phones have included the ability to run user-installed software.
Java ME
The dominant mobile software platform is Java (in its incarnation as "Java Platform, Micro Edition", "Java ME", or formerly "J2ME" ). Java ME runs atop a Virtual Machine (called the KVM) which allows reasonable, but not complete, access to the functionality of the underlying phone. The JSR process serves to incrementally increase the functionality that can be made available to Java ME, while also providing Carriers and OEMs the ability to prevent access, or limit access to provisioned software.
This extra layer of software provides a solid barrier of protection which seeks to limit damage from erroneous or malicious software. It also allows Java software to move freely between different types of phone (and other mobile device) containing radically different electronic components, without modification. The price that is paid is a modest decrease in the potential speed of the game and the inability to utilize the entire functionality of a phone (as Java software can only do what this middle-man layer supports.)
Because of this extra security and compatibility, it is usually quite a simple process to write and distribute Java mobile applications (including games) to a wide range of phones. Usually all that is needed is the freely available JDK (Java Development Kit) for creating the Java software itself, the accompanying Java ME tools (known as the Java Wireless Toolkit) for packaging and testing mobile software, and space on a web server (web site) to host the resulting application once it is ready for public release. Mobile software can be downloaded from many sites.
20. Mobile Marketing:
Mobile marketing can refer to one of two categories of marketing. The first, and relatively new, describes marketing on or with a mobile device, such as a mobile phone (an example of horizontal telecommunication convergence). The second, more traditional definition describes marketing in a moving fashion, for example technology road shows or moving billboards.
Marketing on a mobile phone has become increasingly popular ever since the rise of SMS (Short Message Service) in the early 2000s in Europe and some parts of Asia when businesses started to collect mobile phone numbers and send off wanted (or unwanted) content.
Mobile marketing via SMS has expanded rapidly in Europe and Asia as a new channel to reach the consumer. SMS initially received negative media coverage in many parts of Europe for being a new form of spam, as some advertisers purchased lists and sent unsolicited content to consumers' phones; however, as guidelines were put in place by the mobile operators, SMS has become the most popular branch of the mobile marketing industry, with several hundred million advertising SMS messages sent out every month in Europe alone.
In North America the first cross-carrier SMS short code campaign was run by Labatt Brewing Company in 2002. Over the past few years mobile short codes have been increasingly popular as a new channel to communicate to the mobile consumer. Brands have begun to treat the mobile short code as a mobile domain name allowing the consumer to text message the brand at an event, in store and off any traditional media.
SMS services typically run off a short code, but sending text messages to an email address is another methodology. Short codes are 5- or 6-digit numbers assigned by all the mobile operators in a given country for the use of brand campaigns and other consumer services. The mobile operators vet every application before provisioning and monitor the service to make sure it does not diverge from its original service description.
Besides short codes, inbound SMS is very often based on long numbers which can be used in place of short codes or premium-rated short messages for SMS reception in several applications, such as product promotions and campaigns. Long numbers are internationally available, as well as enabling businesses to have their own number, rather than short codes which are usually shared across a number of brands. Additionally, long numbers are non-premium inbound numbers.
One key criterion for provisioning is that the consumer opts in to the service. The mobile operators demand a double opt in from the consumer and the ability for the consumer to opt out of the service at any time by sending the word STOP via SMS. These guidelines are established in the MMA Consumer Best Practices Guidelines which are followed by all mobile marketers in the United States.
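The double opt-in and STOP flow described above can be sketched as a small message handler. The keywords, replies, and in-memory store below are assumptions for illustration, not any carrier's or aggregator's actual API.

```python
# Minimal sketch of the SMS double opt-in flow: a consumer must confirm
# a subscription before receiving messages, and texting STOP opts them
# out at any time. Keywords and storage are illustrative assumptions.

subscribers = {}  # phone number -> state: "pending" | "confirmed"

def handle_inbound_sms(phone, text):
    keyword = text.strip().upper()
    if keyword == "JOIN":                            # first opt-in
        subscribers[phone] = "pending"
        return "Reply YES to confirm your subscription."
    if keyword == "YES" and subscribers.get(phone) == "pending":
        subscribers[phone] = "confirmed"             # second opt-in completes
        return "You are subscribed. Text STOP to opt out."
    if keyword == "STOP":                            # mandatory opt-out
        subscribers.pop(phone, None)
        return "You have been unsubscribed."
    return None

handle_inbound_sms("+15550001", "JOIN")
handle_inbound_sms("+15550001", "YES")
print(subscribers["+15550001"])        # confirmed
handle_inbound_sms("+15550001", "STOP")
print("+15550001" in subscribers)      # False
```

Note that a message is only ever delivered to numbers in the "confirmed" state, which is the substance of the MMA guideline the text describes.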
21. Market Research:
Market research is for discovering what people want, need, or believe. It can also involve discovering how they act. Once that research is completed, it can be used to determine how to market your product. Questionnaires and focus group discussions are among the common instruments of market research. For starting up a business, a few areas are especially important:
Market information
Through market information you can learn the prices of different commodities in the market and the supply and demand situation. Information about markets can be obtained from many different sources, in a variety of formats.
Market segmentation
Market segmentation is the division of the market or population into subgroups with similar motivations. It is widely used to segment on geographic, personality, demographic, technographic, use-of-product, psychographic, and gender differences.
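A toy version of segmentation can make the idea concrete: a customer list is divided into subgroups by whichever attribute matters for the campaign. The customer records below are invented for the example.

```python
# Toy illustration of market segmentation: dividing a customer list
# into subgroups by geographic or demographic attributes. The records
# are invented for demonstration.
from collections import defaultdict

customers = [
    {"name": "A", "age": 24, "region": "West", "gender": "F"},
    {"name": "B", "age": 58, "region": "East", "gender": "M"},
    {"name": "C", "age": 31, "region": "West", "gender": "F"},
]

def segment(customers, key_fn):
    """Group customers by the value of key_fn applied to each record."""
    groups = defaultdict(list)
    for c in customers:
        groups[key_fn(c)].append(c["name"])
    return dict(groups)

# Geographic segmentation
print(segment(customers, lambda c: c["region"]))
# Demographic segmentation by age band
print(segment(customers, lambda c: "under 35" if c["age"] < 35 else "35+"))
```

The same `segment` helper works for any of the differences listed above; only the key function changes.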
Market trends
Market trends are the upward or downward movements of a market during a period of time. Market size is more difficult to estimate if you are starting with something completely new; in this case, you will have to derive the figures from the number of potential customers or customer segments.
22. Brand Management:
Brand management is the application of marketing techniques to a specific product, product line, or brand. It seeks to increase the product's perceived value to the customer and thereby increase brand franchise and brand equity. Marketers see a brand as an implied promise that the level of quality people have come to expect from a brand will continue with future purchases of the same product. This may increase sales by making a comparison with competing products more favorable. It may also enable the manufacturer to charge more for the product. The value of the brand is determined by the amount of profit it generates for the manufacturer.
This can result from a combination of increased sales and increased price, and/or reduced COGS (cost of goods sold), and/or reduced or more efficient marketing investment. All of these enhancements may improve the profitability of a brand, and thus, "Brand Managers" often carry line-management accountability for a brand's P&L (Profit and Loss) profitability, in contrast to marketing staff manager roles, which are allocated budgets from above, to manage and execute. In this regard, Brand Management is often viewed in organizations as a broader and more strategic role than Marketing alone.
The annual list of the world's most valuable brands, published by Interbrand and BusinessWeek, indicates that the market value of companies often consists largely of brand equity. Research by McKinsey & Company, a global consulting firm, in 2000 suggested that strong, well-leveraged brands produce higher returns to shareholders than weaker, narrower brands. Taken together, this means that brands seriously impact shareholder value, which ultimately makes branding a CEO responsibility.
Types of brands
A number of different types of brands are recognized. A "premium brand" typically costs more than other products in the same category. An "economy brand" is a brand targeted to a high price elasticity market segment. A "fighting brand" is a brand created specifically to counter a competitive threat. When a company's name is used as a product brand name, this is referred to as corporate branding. When one brand name is used for several related products, this is referred to as family branding. When all a company's products are given different brand names, this is referred to as individual branding. When a company uses the brand equity associated with an existing brand name to introduce a new product or product line, this is referred to as "brand extension." When large retailers buy products in bulk from manufacturers and put their own brand name on them, this is called private branding, store brand, white labelling, private label or own brand (UK). Private brands can be differentiated from "manufacturers' brands" (also referred to as "national brands"). When different brands work together to market their products, this is referred to as "co-branding". When a company sells the rights to use a brand name to another company for use on a non-competing product or in another geographical area, this is referred to as "brand licensing." An "employment brand" is created when a company wants to build awareness with potential candidates. In many cases, such as Google, this brand is an integrated extension of their customer brand.
Brand Architecture
The different brands owned by a company are related to each other via brand architecture. In "product brand architecture", the company supports many different product brands with each having its own name and style of expression while the company itself remains invisible to consumers. Procter & Gamble, considered by many to have created product branding, is a choice example with its many unrelated consumer brands such as Tide, Pampers, Ivory and Pantene.
With "endorsed brand architecture", a mother brand is tied to product brands, such as The Courtyard Hotels (product brand name) by Marriott (mother brand name). Endorsed brands benefit from the standing of their mother brand and thus save a company some marketing expense by virtue of promoting all the linked brands whenever the mother brand is advertised.
The third model of brand architecture is most commonly referred to as "corporate branding". The mother brand is used, all products carry this name, and all advertising speaks with the same voice. A good example of this brand architecture is the UK-based conglomerate Virgin. Virgin brands all its businesses with its name (e.g., Virgin Megastore, Virgin Atlantic, Virgin Brides) and uses one style and logo to support each of them.
23. Marketing Dominance:
Market dominance is a measure of the strength of a brand, product, service, or firm, relative to competitive offerings. There is often a geographic element to the competitive landscape. In defining market dominance, one must consider to what extent a product, brand, or firm controls a product category in a given geographic area.
Calculating
There are several ways of calculating market dominance. The most direct is market share. This is the percentage of the total market serviced by a firm or brand. A declining scale of market shares is common in most industries: that is, if the industry leader has say 50% share, the next largest might have 25% share, the next 12% share, the next 6% share, and all remaining firms combined might have 7% share.
Market share is not a perfect proxy of market dominance. The influences of customers, suppliers, competitors in related industries, and government regulations must be taken into account. Although there are no hard and fast rules governing the relationship between market share and market dominance, the following are general criteria:
A company, brand, product, or service that has a combined market share exceeding 60% most probably has market power and market dominance.
A market share of over 35% but less than 60%, held by one brand, product or service, is an indicator of market strength but not necessarily dominance.
A market share of less than 35%, held by one brand, product or service, is not an indicator of strength or dominance and will not raise anti-combines concerns of government regulators.
Market shares within an industry might not exhibit a declining scale. There could be only two firms in a duopolistic market, each with 50% share; or there could be three firms in the industry each with 33% share; or 100 firms each with 1% share. The concentration ratio of an industry is used as an indicator of the relative size of leading firms in relation to the industry as a whole. One commonly used concentration ratio is the four-firm concentration ratio, which consists of the combined market share of the four largest firms, as a percentage, in the total industry. The higher the concentration ratio, the greater the market power of the leading firms.
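The measures above, market share, the "declining scale" pattern, and the four-firm concentration ratio, can be computed directly from firm sales. The sales figures below are invented and chosen to match the declining-scale example in the text.

```python
# Numeric sketch of market share and the four-firm concentration ratio
# (CR4). Sales figures are invented for demonstration.

sales = {"Firm A": 50, "Firm B": 25, "Firm C": 12, "Firm D": 6,
         "Firm E": 4, "Firm F": 3}
total = sum(sales.values())

# Market share: each firm's percentage of the total market
shares = {firm: 100 * s / total for firm, s in sales.items()}
for firm, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{firm}: {share:.0f}% share")

# CR4: combined share of the four largest firms in the industry
cr4 = sum(sorted(shares.values(), reverse=True)[:4])
print(f"CR4 = {cr4:.0f}%")
```

With these figures CR4 comes out at 93%, a highly concentrated industry by the logic of the paragraph above; a market of 100 firms at 1% each would instead give CR4 = 4%.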
24. Digital Marketing:
Digital marketing is the promotion of brands using the Internet, mobile and other interactive channels.
Digital Marketing is the practice of promoting products and services using digital distribution channels to reach consumers in a timely, relevant, personal and cost-effective manner.
Whilst digital marketing does include many of the techniques and practices contained within the category of Internet Marketing, it extends beyond this by including other channels with which to reach people that do not require the use of The Internet. As a result of this non-reliance on the Internet, the field of digital marketing includes a whole host of elements such as mobile phones, sms/mms, display / banner ads and digital outdoor.
Previously seen as a stand-alone service in its own right, digital marketing is increasingly seen as a domain that can and does cover most, if not all, of the more traditional marketing areas, such as direct marketing, by providing the same method of communicating with an audience but in a digital fashion.
Pull digital marketing technologies involve the user having to seek out and directly grab (or pull) the content, via web searches for example. Web sites, blogs and streaming media (audio and video) are good examples of this. In each of these examples, users have a specific link (URL) to view the content.
Push digital marketing technologies involve both the marketer (creator of the message) as well as the recipients (the user). Email, SMS, RSS are examples of push digital marketing. In each of these examples, the marketer has to send (push) the messages to the users (subscribers) in order for the message to be received.
Pros:
Can be personalized -- messages received can be highly targeted and specific to selected criteria – like a special offer for females, 21 years old or over and living in California.
Detailed tracking and reporting – marketers can see not only how many people saw their message but also specific information about each user such as their name as well as demographic and psychographic data.
High Return on Investment (ROI) possible – if executed the right way, push messaging can help drive new revenue as well as brand reinforcement.
Cons:
Compliance issue – each push messaging technology has its own set of regulations, from minor (RSS) to heavily controlled (email and text messaging)
Requires mechanism to deliver content – the marketer has to use an application to send the message, from an email marketing system to RSS feeders.
Delivery can be blocked – if the marketer does not follow the regulations set forth by each push message type, the content can be refused or rejected before getting to the intended recipient.
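The targeting described in the pros list above, "a special offer for females, 21 years old or over and living in California", amounts to filtering a subscriber list against criteria. The records below are invented for the example.

```python
# Sketch of push-message targeting: selecting which subscribers receive
# a message based on gender, age, and location criteria. The subscriber
# records are invented for demonstration.

subscribers = [
    {"email": "a@example.com", "gender": "F", "age": 24, "state": "CA"},
    {"email": "b@example.com", "gender": "M", "age": 30, "state": "CA"},
    {"email": "c@example.com", "gender": "F", "age": 19, "state": "CA"},
    {"email": "d@example.com", "gender": "F", "age": 40, "state": "NY"},
]

def matches_offer(sub):
    """Special offer for females, 21 or over, living in California."""
    return sub["gender"] == "F" and sub["age"] >= 21 and sub["state"] == "CA"

recipients = [s["email"] for s in subscribers if matches_offer(s)]
print(recipients)  # ['a@example.com']
```

The same filtering step is what makes the detailed tracking and ROI claims in the pros list possible: the marketer knows exactly who was eligible and who was sent the message.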
25. Advertising:
Advertising is a form of communication that typically attempts to persuade potential customers to purchase or to consume more of a particular brand of product or service. "While now central to the contemporary global economy and the reproduction of global production networks, it is only quite recently that advertising has been more than a marginal influence on patterns of sales and production. The formation of modern advertising was intimately bound up with the emergence of new forms of monopoly capitalism around the end of the 19th and beginning of the 20th century as one element in corporate strategies to create, organize and where possible control markets, especially for mass produced consumer goods. Mass production necessitated mass consumption, and ..."
Organizations that frequently spend large sums of money on advertising that sells what is not, strictly speaking, a product or service include political parties, interest groups, religious organizations, and military recruiters. Non-profit organizations are not typical advertising clients, and may rely on free modes of persuasion, such as public service announcements. Money spent on advertising has increased dramatically in recent years. In 2007, spending on advertising was estimated at over $150 billion in the United States and $385 billion worldwide, with the latter expected to exceed $450 billion by 2010.
While advertising can be seen as necessary for economic growth, it is not without social costs. Unsolicited Commercial Email and other forms of spam have become so prevalent as to have become a major nuisance to users of these services, as well as being a financial burden on internet service providers. Advertising is increasingly invading public spaces, such as schools, which some critics argue is a form of child exploitation.
26. Technical Accounting:
Bookkeeping is the recording of financial transactions. Transactions include sales, purchases, income, and payments by an individual or organization. Bookkeeping is usually performed by a bookkeeper. Bookkeeping should not be confused with accounting. The accounting process is usually performed by an accountant. The accountant creates reports from the financial transactions recorded by the bookkeeper. There are some common methods of bookkeeping, such as the single-entry bookkeeping system and the double-entry bookkeeping system. But while these systems may be seen as "real" bookkeeping, any process that involves the recording of financial transactions is a bookkeeping process.
A bookkeeper (or book-keeper), also known as an accounting clerk or accounting technician, is a person who records the day-to-day financial transactions of an organization. A bookkeeper is usually responsible for writing the "daybooks." The daybooks consist of purchases, sales, receipts, and payments. The bookkeeper is responsible for ensuring all transactions are recorded in the correct daybook, supplier's ledger, customer ledger, and general ledger. The bookkeeper brings the books to the trial balance stage. An accountant may prepare the income statement and balance sheet using the trial balance and ledgers prepared by the bookkeeper.
Bookkeeping systems
Two common bookkeeping systems used by businesses and other organizations are the single-entry bookkeeping system and the double-entry bookkeeping system. Single-entry bookkeeping uses only income and expense accounts, recorded primarily in a revenue and expense journal. Single-entry bookkeeping is adequate for many small businesses. Double-entry bookkeeping requires posting (recording) each transaction twice, using debits and credits.
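The defining property of double-entry bookkeeping, that every transaction is posted twice, as a matched debit and credit, can be sketched in a few lines. The account names and amounts below are illustrative.

```python
# Minimal sketch of double-entry bookkeeping: every transaction is
# posted as a debit to one account and a credit to another, so total
# debits always equal total credits. Account names are illustrative.

ledger = []  # list of (account, debit, credit)

def post(debit_account, credit_account, amount):
    """Record one transaction as a matched debit/credit pair."""
    ledger.append((debit_account, amount, 0.0))
    ledger.append((credit_account, 0.0, amount))

post("Cash", "Sales Revenue", 500.00)  # a cash sale
post("Rent Expense", "Cash", 200.00)   # paying rent

total_debits = sum(d for _, d, _ in ledger)
total_credits = sum(c for _, _, c in ledger)
print(total_debits == total_credits)   # True: the books balance
```

This balance check is essentially what the trial balance stage mentioned earlier verifies: if debits and credits do not match, an entry was posted incorrectly.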
27. Internet Marketing:
Internet marketing, also referred to as i-marketing, web marketing, online marketing, or eMarketing, is the marketing of products or services over the Internet.
The Internet has brought many unique benefits to marketing, one of which being lower costs for the distribution of information and media to a global audience. The interactive nature of Internet marketing, both in terms of providing instant response and eliciting responses, is a unique quality of the medium. Internet marketing is sometimes considered to have a broader scope because it refers not only to digital media such as the Internet, e-mail, and wireless media, but also includes management of digital customer data and electronic customer relationship management (ECRM) systems.
Internet marketing ties together creative and technical aspects of the Internet, including design, development, advertising, and sale.
Internet marketing also refers to the placement of media along different stages of the customer engagement cycle through search engine marketing (SEM), search engine optimization (SEO), banner ads on specific websites, e-mail marketing, and Web 2.0 strategies. In 2008, The New York Times, working with comScore, published an initial estimate to quantify the user data collected by large Internet-based companies. Counting four types of interactions with company websites in addition to the hits from advertisements served from advertising networks, the authors found the potential for collecting data upward of 2,500 times on average per user per month.
One-to-one approach
The targeted user is typically browsing the Internet alone, so the marketing messages can reach them personally. This approach is used in search marketing, where the advertisements are based on search engine keywords entered by the user. And now with the advent of Web 2.0 tools, many users can interconnect as "peers."
28. Mobile Computing:
Mobile computing is a generic term describing one's ability to use technology while moving, as opposed to portable computers, which are only practical for use while deployed in a stationary configuration.
Many commercial and government field forces deploy a ruggedized portable computer such as the Panasonic Toughbook, or larger rack-mounted computers, with their fleet of vehicles. This requires the units to be anchored to the vehicle for driver safety, device security, and user ergonomics. Ruggedized computers are rated for the severe vibration associated with large service vehicles and off-road driving, and the harsh environmental conditions of constant professional use such as in EMS, fire and public safety.
Other elements that enable the unit to function in a vehicle include:
Operating temperature: A vehicle cabin can often experience temperature swings from -20F to +140F. Computers typically must be able to withstand these temperatures while operating. Typical fan-based cooling has stated limits of 95F-100F of ambient temperature, and temperatures below freezing require localized heaters to bring components up to operating temperature (based on independent studies by the SRI Group and by Panasonic R&D).
Vibration: Vehicles typically have considerable vibration that can decrease life expectancy of computer components, notably rotational storage such as HDDs.
Daylight or sunlight readability: Visibility of standard screens becomes an issue in bright sunlight.
Touch screens: These enable users to easily interact with the units in the field without removing gloves.
High-temperature battery settings: Lithium-ion batteries are sensitive to high-temperature conditions during charging. A computer designed for the mobile environment should be designed with a high-temperature charging function that limits the charge to 85% or less of capacity.
External wireless connections and external GPS antenna connections: Necessary to contend with the typical metal cabins of vehicles and their impact on wireless reception, and to take advantage of much more capable external transceiver equipment.
Several specialized manufacturers such as National Products Inc (Ram Mounts), Gamber Johnson and Led Co build mounts for vehicle mounting of computer equipment for specific vehicles. The mounts are built to withstand the harsh conditions and maintain ergonomics.
Specialized installation companies, such as Touch Star Pacific, specialize in designing the mount design, assembling the proper parts, and installing them in a safe and consistent manner away from airbags, vehicle HVAC controls, and driver controls. Frequently installations will include a WWAN modem, power conditioning equipment, and WWAN/WLAN/GPS/etc… transceiver antenna mounted external to the vehicle.
29. Email Marketing:
E-mail marketing is a form of direct marketing which uses electronic mail as a means of communicating commercial or fundraising messages to an audience. In its broadest sense, every e-mail sent to a potential or current customer could be considered e-mail marketing. However, the term is usually used to refer to sending e-mails with the purpose of enhancing the relationship of a merchant with its current or previous customers and encouraging customer loyalty and repeat business.
Many companies use e-mail marketing to communicate with existing customers, but many other companies send unsolicited bulk e-mail, also known as spam.
Internet system administrators have always considered themselves responsible for dealing with "abuse of the net", but not "abuse on the net". That is, they will act quite vigorously against spam, but will leave issues such as libel or trademark infringement to the legal system. Most administrators possess a passionate dislike for spam, which they define as any unsolicited e-mail. Draconian measures—such as taking down a corporate website, with or without warning—are entirely normal responses to spamming. Typically, the terms of service in Internet companies' contracts permit such actions; therefore, the spammer often has no recourse.
Illicit e-mail marketing predates legitimate e-mail marketing. On the early Internet (i.e., Arpanet), it was not permitted to use the medium for commercial purposes. As a result, marketers attempting to establish themselves as legitimate businesses in e-mail marketing have had an uphill battle, hampered also by criminal spam operations billing themselves as legitimate ones.
It is frequently difficult for observers to distinguish between legitimate and spam e-mail marketing. First, spammers attempt to represent themselves as legitimate operators. Second, direct-marketing political groups such as the United States Direct Marketing Association (DMA) have pressured legislatures to legalize activities that some Internet operators consider to be spamming, such as the sending of "opt-out" unsolicited commercial e-mail. Third, the sheer volume of spam has led some users to mistake legitimate commercial e-mail for spam. This situation arises when a user receives e-mail from a mailing list to which he/she subscribes. Additional confusion arises when both legitimate and spam messages have a similar appearance, as when messages include HTML and graphics.
One effective technique used by established e-mail marketing companies is the "double opt-in" method: a potential recipient must manually confirm a request for information by clicking a unique link and entering a unique code identifier, confirming that the owner of the recipient e-mail address has indeed requested the information. Responsible e-mail marketing and autoresponder companies use this double opt-in method to confirm each request before any information is sent out.
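As an illustrative sketch of that flow (the function and variable names here are hypothetical, not taken from any particular mailing-list product), double opt-in reduces to two steps: issue an unguessable code when an address is submitted, and add the address to the list only when that same code comes back:

```python
import secrets

# Hypothetical in-memory stores; a real system would persist
# these in a database.
pending = {}       # confirmation code -> submitted address
confirmed = set()  # addresses whose owners clicked the link

def request_subscription(email):
    """Step 1: an address is submitted. Generate a unique code;
    a real system would e-mail a confirmation link containing it."""
    code = secrets.token_urlsafe(16)  # unguessable identifier
    pending[code] = email
    return code  # would be embedded in the confirmation link

def confirm_subscription(code):
    """Step 2: the address owner clicks the link. Only then is
    the address added to the mailing list; the code is single-use."""
    email = pending.pop(code, None)
    if email is None:
        return False  # unknown or already-used code
    confirmed.add(email)
    return True
```

Because the code is delivered only to the mailbox itself, a forged sign-up by a third party never completes, which is why responsible marketers send nothing before this confirmation arrives.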
According to a report issued by the e-mail services company Return Path, as of mid-2008 e-mail deliverability was still an issue for legitimate marketers: legitimate e-mail servers averaged a delivery rate of 56%, while twenty percent of the messages were rejected and eight percent were filtered.
Due to the volume of spam e-mail on the Internet, spam filters are essential to most users. Some marketers report that legitimate commercial e-mail messages frequently get caught and hidden by filters; however, it is somewhat less common for e-mail users to complain that spam filters block legitimate mail.
Companies considering the use of an e-mail marketing program must make sure that their program does not violate spam laws such as the United States' Controlling the Assault of Non-Solicited Pornography and Marketing Act (CAN-SPAM), the European Privacy and Electronic Communications Regulations 2003, or their Internet service provider's acceptable use policy. Even if a company adheres to the applicable laws, it can be blacklisted (e.g., on SPEWS) if Internet e-mail administrators determine that the company is sending spam.
30. Social Media Optimization:
Social media optimization (SMO) is a set of methods for generating publicity through social media, online communities and community websites. Methods of SMO include adding RSS feeds, social news buttons, blogging, and incorporating third-party community functionalities like images and videos. Social media optimization is related to search engine marketing, but differs in several ways, primarily the focus on driving traffic from sources other than search engines, though improved search ranking is also a benefit of successful SMO.
Social media optimization is in many ways related to viral marketing, where word of mouth is created not through friends or family but through networking on social bookmarking, video and photo sharing websites. In a similar way, engagement with blogs achieves the same effect by sharing content through RSS in the blogosphere and special blog search engines.
Social media optimization is considered an integral part of an online reputation management (ORM) or search engine reputation management (SERM) strategy for organizations or individuals who care about their online presence.
Social media optimization (SMO) is not limited to marketing and brand building. Increasingly, smart businesses are integrating social media participation into their knowledge management strategy (i.e. product/service development, recruiting, employee engagement and turnover, brand building, customer satisfaction and relations, business development and more).
31. Revenue Sharing:
Revenue sharing has multiple, related meanings depending on context.
In business, revenue sharing refers to the sharing of profits and losses among different groups. One form shares between the general partner(s) and limited partners in a limited partnership. Another form shares with a company's employees, and another between companies in a business alliance.
On the Internet, revenue sharing is also known as cost per sale, and accounts for about 80% of affiliate compensation programs. E-commerce web site operators using revenue sharing pay affiliates a certain percentage of sales revenues (usually excluding tax, shipping and other third-party costs that the customer pays) generated by customers whom the affiliates refer via various advertising methods.
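As a minimal sketch of that payout rule (the 10% rate and the dollar figures in the comment are invented examples, not industry benchmarks), the affiliate's commission is computed on revenue net of tax and shipping:

```python
def affiliate_commission(order_total, tax, shipping, rate):
    """Pay the affiliate a percentage of sales revenue, typically
    excluding tax, shipping and other third-party costs."""
    commissionable = order_total - tax - shipping
    return round(commissionable * rate, 2)

# Example: a $120 order with $8 tax and $12 shipping at a 10% share.
# Commissionable revenue is $100, so the affiliate earns $10.00.
```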
Another form of online revenue sharing consists of people working together, registering online in a way similar to that of a corporation, and sharing the proceeds.
United States government revenue sharing was in place from 1972-1987. Under this policy, Congress gave an annual amount of federal tax revenue to the states and their cities, counties and townships. Revenue sharing was extremely popular with state officials, but it lost federal support during the Reagan Administration. In 1987, revenue sharing was replaced with block grants in smaller amounts to reduce the federal deficit.
32. Mobile advertising:
Mobile advertising is a form of advertising via mobile (wireless) phones or other mobile devices. It is a subset of mobile marketing.
Some see mobile advertising as closely related to online or Internet advertising, though its reach is far greater: currently, most mobile advertising is targeted at mobile phones, which were estimated at a global total of 3 billion as of 2007 and expected to reach 4 billion in 2008. By comparison, computers, including desktops and laptops, are currently estimated at 800 million globally.
It is probable that advertisers and the media industry will increasingly take account of a bigger and fast-growing mobile market, though it remains at around 1% of global advertising spend. Mobile media is evolving rapidly, and while mobile phones will continue to be the mainstay, it is not clear whether phones based on cellular backhaul or smartphones using WiFi hot spots or WiMAX hot zones will gain the most ground. However, such is the emergence of this form of advertising that there is now a dedicated global awards ceremony organized every year by Visiongain.
As mobile phones outnumber TV sets by over 2 to 1, internet users by nearly 3 to 1, and the total laptop and desktop PC population by over 4 to 1, advertisers in many markets have recently rushed to this medium. In Spain 75% of mobile phone owners receive ads, in France 62% and in Japan 54%. More remarkably, as mobile advertising matures, as in the most advanced markets, user involvement also matures. In Japan today, already 44% of mobile phone owners click on ads they receive on their phones.
Types of mobile ads
In some markets, this type of advertising is most commonly seen as a Mobile Web Banner (top of page) or Mobile Web Poster (bottom of page banner), while in others, it is dominated by SMS advertising (which has been estimated at over 90% of mobile marketing revenue worldwide). Other forms include MMS advertising, advertising within mobile games and mobile videos, advertising during mobile TV receipt, full-screen interstitials, which appear while a requested item of mobile content or a mobile web page is loading, and audio advertisements that can take the form of a jingle before a voicemail recording, or an audio recording played while interacting with a telephone-based service such as movie ticketing or directory assistance.
The Mobile Marketing Association has published mobile advertising guidelines, but it is difficult to keep such guidelines current in such a fast-developing area.
The effectiveness of a mobile ad campaign can be measured in a variety of ways. The main measurements are impressions (views) and click-through rates. Additional measurements include conversion rates, such as click-to-call rates, and other degrees of interactive measurement.
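Those two core measurements are simple ratios, sketched below; the traffic figures in the comment are invented for illustration only:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks per impression (view)."""
    return clicks / impressions

def conversion_rate(conversions, clicks):
    """Conversion rate, e.g. click-to-call: completed
    actions per click received."""
    return conversions / clicks

# Example: 50,000 impressions yielding 600 clicks and 30 calls
# gives a CTR of 1.2% and a click-to-call rate of 5%.
```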
33. Search engine optimization:
Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results. Typically, the earlier a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, and industry-specific vertical search engines, giving a web site a stronger web presence.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.
The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems and shopping carts that are easy to optimize.
Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms and keyword stuffing that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.
34. Web analytics:
Web analytics is the measurement, collection, analysis and reporting of internet data for purposes of understanding and optimizing web usage. There are two categories of web analytics: off-site and on-site. Off-site web analytics refers to web measurement and analysis irrespective of whether you own or maintain a website. It includes the measurement of a website's potential audience (opportunity), share of voice (visibility), and buzz (comments) that is happening on the Internet as a whole.
On-site web analytics measure a visitor's journey once on your website. This includes its drivers and conversions; for example, which landing pages encourage people to make a purchase. On-site web analytics measures the performance of your website in a commercial context.
Web servers record some of their transactions in a logfile. It was soon realized that these logfiles could be read by a program to provide data on the popularity of the website. Thus arose web log analysis software.
In the early 1990s, web site statistics consisted primarily of counting the number of client requests (or hits) made to the web server. This was a reasonable method initially, since each web site often consisted of a single HTML file. However, with the introduction of images in HTML, and web sites that spanned multiple HTML files, this count became less useful. The first true commercial Log Analyzer was released by IPRO in 1994.
Two units of measure were introduced in the mid 1990s to gauge more accurately the amount of human activity on web servers. These were page views and visits (or sessions). A page view was defined as a request made to the web server for a page, as opposed to a graphic, while a visit was defined as a sequence of requests from a uniquely identified client that expired after a certain amount of inactivity, usually 30 minutes. The page views and visits are still commonly displayed metrics, but are now considered rather unsophisticated measurements.
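The visit definition above can be sketched as a small sessionization routine. This is an illustrative reconstruction of the rule as stated (a new visit starts after 30 minutes of inactivity per client), not the algorithm of any particular log analyzer:

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def count_visits(requests):
    """Group per-client page requests into visits (sessions).

    `requests` is a list of (client_id, timestamp) pairs, one per
    page request. A new visit begins the first time a client is
    seen, or whenever the gap since the client's previous request
    exceeds the 30-minute inactivity window.
    """
    last_seen = {}  # client_id -> timestamp of most recent request
    visits = 0
    for client, ts in sorted(requests, key=lambda r: r[1]):
        prev = last_seen.get(client)
        if prev is None or ts - prev > SESSION_TIMEOUT:
            visits += 1  # first request, or inactivity window expired
        last_seen[client] = ts
    return visits
```

Each request in the input also counts as one page view, so the two mid-1990s metrics fall out of the same log scan.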
The emergence of search engine spiders and robots in the late 1990s, along with web proxies and dynamically assigned IP addresses for large companies and ISPs, made it more difficult to identify unique human visitors to a website. Log analyzers responded by tracking visits by cookies, and by ignoring requests from known spiders.
The extensive use of web caches also presented a problem for logfile analysis. If a person revisits a page, the second request will often be retrieved from the browser's cache, and so no request will be received by the web server. This means that the person's path through the site is lost. Caching can be defeated by configuring the web server, but this can result in degraded performance for the visitor to the website.
35. Marketing strategy:
A marketing strategy is a process that can allow an organization to concentrate its limited resources on the greatest opportunities to increase sales and achieve a sustainable competitive advantage. A marketing strategy should be centered on the key concept that customer satisfaction is the main goal.
Types of strategies:
Marketing strategies may differ depending on the unique situation of the individual business. However, there are a number of ways of categorizing some generic strategies. A brief description of the most common categorizing schemes is presented below:
Strategies based on market dominance - In this scheme, firms are classified based on their market share or dominance of an industry. Typically there are three types of market dominance strategies:
Porter generic strategies - strategy on the dimensions of strategic scope and strategic strength. Strategic scope refers to market penetration, while strategic strength refers to the firm's sustainable competitive advantage. The generic strategy framework (Porter 1984) comprises two alternatives, each with two alternative scopes: differentiation and low-cost leadership, each with a dimension of focus, broad or narrow.
Product differentiation
Market segmentation
Innovation strategies - This deals with the firm's rate of new product development and business model innovation. It asks whether the company is on the cutting edge of technology and business innovation. There are three types:
Pioneer
Close follower
Late follower
Growth strategies - In this scheme we ask the question, "How should the firm grow?" There are a number of different ways of answering that question, but the most common gives four answers:
Horizontal integration
Vertical integration
Diversification
Intensification
There are many companies, especially those in the Consumer Packaged Goods (CPG) market, that adopt the theory of running their business centered on consumer, shopper and retailer needs. Their marketing departments spend quality time looking for "growth opportunities" in their categories by identifying relevant insights (both mindsets and behaviors) on their target consumers, shoppers and retail partners. These growth opportunities emerge from changes in market trends, shifting segment dynamics, and internal brand or operational business challenges. The marketing team can then prioritize these growth opportunities and begin to develop strategies to exploit them, which could include new or adapted products and services as well as changes to the 7Ps.
36. Marketing management:
Marketing management is a business discipline which is focused on the practical application of marketing techniques and the management of a firm's marketing resources and activities. Marketing managers are often responsible for influencing the level, timing, and composition of customer demand. There is no universally accepted definition of the term; in part, this is because the role of a marketing manager can vary significantly based on a business' size, corporate culture, and industry context. For example, in a large consumer products company, the marketing manager may act as the overall general manager of his or her assigned product.
From this perspective, the scope of marketing management is quite broad. The implication of such a definition is that any activity or resource the firm uses to acquire customers and manage the company's relationships with them is within the purview of marketing management. Additionally, the Kotler and Keller definition encompasses both the development of new products and services and their delivery to customers.
Marketing expert Regis McKenna expressed a similar viewpoint in his influential 1991 Harvard Business Review article "Marketing is everything." McKenna argued that because marketing management encompasses all factors that influence a company's ability to deliver value to customers, it must be "all-pervasive, part of everyone's job description, from the receptionists to the Board of Directors."
This view is also consistent with the perspective of management guru Peter Drucker, who wrote: "Because the purpose of business is to create a customer, the business enterprise has two--and only these two--basic functions: marketing and innovation. Marketing and innovation produce results; all the rest are costs. Marketing is the distinguishing, unique function of the business."
37. System Software:
System software is closely related to, but distinct from, the operating system. It is computer software that provides the infrastructure, on top of the operating system, within which programs can operate; i.e., it allows application programs to perform different tasks on the computer. Operating systems, such as Microsoft Windows NT, Mac OS X or Linux, provide a virtual machine that system software can use.
System software is software that essentially allows the user to work. Without system software the computer cannot carry out complex tasks. In contrast to system software, software that allows you to do things like create text documents, play games, listen to music, or surf the web is called application software.
In general application programs are software that enables the end-user to perform specific, productive tasks, such as word processing or image manipulation. System software performs tasks like transferring data from memory to disk, or rendering text onto a display device.
Types of system software
System software is often classified together with the operating system. An operating system creates an interface between other software and the system hardware, while other system software refines that interface or allows greater interaction with the user.
System software helps use the operating system and computer system. It includes diagnostic tools, compilers, servers, windowing systems, utilities, language translators, data communication programs, data management programs and more. The purpose of system software is to insulate the applications programmer as much as possible from the details of the particular computer complex being used, especially memory and other hardware features, and such accessory devices as communications, printers, readers, displays and keyboards.
38. Programming Software:
A programming tool or software development tool is a program or application that software developers use to create, debug, maintain, or otherwise support other programs and applications. The term usually refers to relatively simple programs that can be combined together to accomplish a task, much as one might use multiple hand tools to fix a physical object.
The history of software tools began with the first computers in the early 1950s, which used linkers, loaders, and control programs. Tools became prominent with Unix in the early 1970s, with tools like grep, awk and make that were meant to be combined flexibly with pipes. The term "software tools" came from the book of the same name by Brian Kernighan and P. J. Plauger.
Tools were originally simple and lightweight. As some tools have been maintained, they have been integrated into more powerful integrated development environments (IDEs). These environments consolidate functionality into one place, sometimes increasing simplicity and productivity, other times sacrificing flexibility and extensibility. The workflow of IDEs is routinely contrasted with alternative approaches, such as the use of Unix shell tools with text editors like Vim and Emacs.
The distinction between tools and applications is murky. For example, developers use simple databases (such as a file containing a list of important values) all the time as tools. However a full-blown database is usually thought of as an application in its own right.
For many years, computer-aided software engineering (CASE) tools were sought after. Successful tools have proven elusive. In one sense, CASE tools emphasized design and architecture support, such as for UML, but the most successful of these tools are IDEs.
The ability to use a variety of tools productively is one hallmark of a skilled software engineer.
39. Object-oriented programming:
Object-oriented programming (OOP) is a programming paradigm that uses "objects" — data structures consisting of data fields and methods — and their interactions to design applications and computer programs. Programming techniques may include features such as information hiding, data abstraction, encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP.
Object-oriented programming has roots that can be traced to the 1960s. As hardware and software became increasingly complex, quality was often compromised. Researchers studied ways to maintain software quality and developed object-oriented programming in part to address common problems by strongly emphasizing discrete, reusable units of programming logic. The methodology focuses on data rather than processes, with programs composed of self-sufficient modules (objects) each containing all the information needed to manipulate its own data structure.
This is in contrast to the existing modular programming which had been dominant for many years that focused on the function of a module, rather than specifically the data, but equally provided for code reuse, and self-sufficient reusable units of programming logic, enabling collaboration through the use of linked modules (subroutines). This more conventional approach, which still persists, tends to consider data and behavior separately.
An object-oriented program may thus be viewed as a collection of cooperating objects, as opposed to the conventional model, in which a program is seen as a list of tasks (subroutines) to perform. In OOP, each object is capable of receiving messages, processing data, and sending messages to other objects and can be viewed as an independent 'machine' with a distinct role or responsibility. The actions (or "operators") on these objects are closely associated with the object. For example, the data structures tend to carry their own operators around with them (or at least "inherit" them from a similar object or class).
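As a minimal illustration of that description (the class and its methods are invented for this example), an object bundles its data fields with the operators that act on them, other objects interact with it only by sending messages (calling methods), and a subclass inherits and extends that logic:

```python
class BankAccount:
    """A self-sufficient module: the data (balance) travels
    with the operators that manipulate it."""

    def __init__(self, owner, balance=0):
        self._owner = owner      # data field
        self._balance = balance  # data field, hidden behind methods

    def deposit(self, amount):   # an operator carried by the object
        self._balance += amount

    def withdraw(self, amount):
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    def balance(self):           # other objects "send a message"
        return self._balance     # rather than touch the data directly


class SavingsAccount(BankAccount):
    """Inheritance: the subclass reuses the parent's operators
    and adds a new one of its own."""

    def add_interest(self, rate):
        self.deposit(self.balance() * rate)
```

The program that uses these classes is then a collection of cooperating objects exchanging messages, rather than a list of subroutines operating on shared data.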
The Simula programming language was the first to introduce the concepts underlying object-oriented programming (objects, classes, subclasses, virtual methods, coroutines, and discrete event simulation) as a superset of Algol. Simula also used automatic garbage collection which had been invented earlier for the functional programming language Lisp. Simula was used for physical modeling, such as models to study and improve the movement of ships and their content through cargo ports. Smalltalk was the first programming language to be called "object-oriented".
40. Automotive design:
Automotive design is the profession involved in the development of the appearance, and to some extent the ergonomics, of motor vehicles or more specifically road vehicles. This most commonly refers to automobiles but also refers to motorcycles, trucks, buses, coaches, and vans. The functional design and development of a modern motor vehicle is typically done by a large team from many different disciplines, including automotive engineering. Automotive design in this context is primarily concerned with developing the visual appearance or aesthetics of the vehicle, though it is also involved in the creation of the product concept. Automotive design is practiced by designers who usually have an art background and a degree in industrial design or transportation design.
Design elements
The aesthetic value will need to correspond to ergonomic functionality and utility features as well. In particular, vehicular electronic components and parts present new challenges to automotive designers, who are required to stay current with the latest information and knowledge associated with emerging vehicular gadgetry, particularly dashtop mobile devices like GPS navigation, satellite radio, HD radio, mobile TV, MP3 players, video playback and smartphone interfaces. Though not all the new vehicular gadgets will be designated as factory standard items, some of them may be integral to determining the future course of any specific vehicle model.
Exterior design (styling):
The stylist responsible for the design of the exterior of the vehicle develops the proportions, shape, and surfaces of the vehicle. Exterior design is first done in a series of digital or manual drawings; progressively more detailed drawings are executed and approved. Clay (industrial plasticine) and/or digital models are developed from, and along with, the drawings. The data from these models are then used to create a full-sized mock-up of the final design (body in white). With 3- and 5-axis CNC milling machines, the clay model is first designed in a computer program and then "carved" using the machine and large amounts of clay. Even in the era of sophisticated 3D software and virtual models on powerwalls, the clay model is still the most important tool for evaluating the design of a car and is therefore used throughout the industry.
Interior design (styling)
The stylist responsible for the design of the vehicle interior develops the proportions, shape, placement, and surfaces for the instrument panel, seats, door trim panels, headliner, pillar trims, etc. Here the emphasis is on ergonomics and the comfort of the passengers. The procedure here is the same as with exterior design (sketch, digital model and clay model).
41. Air Filter:
A particulate air filter is a device composed of fibrous materials which removes solid particulates such as dust, pollen, mold, and bacteria from the air. A chemical air filter consists of a sorbent or catalyst for the removal of airborne molecular contaminants such as volatile organic compounds or ozone. Air filters are used in applications where air quality is important, notably in building ventilation systems and in engines.
Some buildings, as well as aircraft and other man-made environments (e.g., satellites and space shuttles) use foam, pleated paper, or spun fiberglass filter elements. Another method uses fibers or elements with a static electric charge, which attract dust particles. The air intakes of internal combustion engines and compressors tend to use paper, foam, or cotton filters. Oil bath filters have fallen out of favor. The technology of air intake filters for gas turbines has improved significantly in recent years, due to improvements in the aerodynamics and fluid dynamics of the air-compressor part of gas turbines.
Automotive cabin air filter
The cabin air filter is typically a pleated-paper filter that is placed in the outside-air intake for the vehicle's passenger compartment. Some of these filters are rectangular and similar in shape to the combustion air filter. Others are uniquely shaped to fit the available space of particular vehicles' outside-air intakes. Being a relatively recent addition to automobile equipment, this filter is often overlooked. Clogged or dirty cabin air filters can significantly reduce airflow from the cabin vents, as well as introduce allergens into the cabin air stream.
Internal combustion air filters
The combustion air filter prevents abrasive particulate matter from entering the engine's cylinders, where it would cause mechanical wear and oil contamination. Most fuel-injected vehicles use a pleated paper filter element in the form of a flat panel. This filter is usually placed inside a plastic box connected to the throttle body with an intake tube.
Older vehicles that use carburetors or throttle body fuel injection typically use a cylindrical air filter, usually a few inches high and between 6 and 16 inches in diameter. This is positioned above the carburetor or throttle body, usually in a metal or plastic container which may incorporate ducting to provide cool and/or warm inlet air, and secured with a metal or plastic lid.
42. Interior design:
Interior design is a profession concerned with anything that is found inside a space - walls, windows, doors, finishes, textures, light, furnishings and furniture. All of these elements are used by interior designers to develop a functional, safe, and aesthetically pleasing space for a building's user.
The work of an interior designer draws upon many disciplines including environmental psychology, architecture, product design, and traditional decoration (aesthetics and cosmetics). They plan the spaces of almost every type of building including: hotels, corporate spaces, schools, hospitals, private residences, shopping malls, restaurants, theaters, and airport terminals. Today, interior designers must be attuned to architectural detailing including floor plans, home renovations, and construction codes. Some interior designers are architects as well.
Specializations
In jurisdictions where the profession is regulated by the government, designers must meet broad qualifications and show competency in the entire scope of the profession, not only in a specialty. Designers may elect to obtain specialist certification offered by private organizations. In the United States, interior designers who also possess environmental expertise in design solutions for sustainable construction can receive accreditation in this area by taking the Leadership in Energy and Environmental Design (LEED) examination.
The specialty areas that involve interior designers are limited only by the imagination and are continually growing and changing. With the increase in the aging population, an increased focus has been placed on developing solutions to improve the living environment of the elderly population, which takes into account health and accessibility issues that can affect the design. Awareness of the ability of interior spaces to create positive changes in people's lives is increasing, so interior design is also becoming relevant to this type of advocacy.
Earnings:
Interior design earnings vary based on employer, number of years of experience, and the reputation of the individual. For residential projects, self-employed interior designers usually earn a per-hour fee plus a percentage of the total cost of furniture, lighting, artwork, and other design elements. For commercial projects, they may charge per-hour fees, or a flat fee for the whole project. The median annual earnings for wage and salary interior designers in 2006 were $42,260. The middle 50% earned between $31,830 and $57,230; the lowest 10 percent earned less than $24,270, and the highest 10 percent earned more than $78,760.
While median earnings are an important indicator of average salaries, it is essential to look at additional key factors in a discussion about revenue generated from design services. Location, demographics of the client base and scope of work all affect the potential earnings of a designer. With regard to location, in central metropolitan areas, where the cost of living and median earnings are generally higher, the potential earnings of interior designers and decorators are higher as well. Indeed, urban areas attract a greater population of potential clients, thereby creating greater demand for design services.
Additionally, as the average square footage of homes and offices has increased over time, the scope of work performed translates directly to higher earnings. Scope refers to the overall size and detail of a project - materials, furnishings, paint, fabrics and architectural embellishments utilized are all examples of scope. As stated above, earnings for interior designers and decorators may include a margin charged to the client as a percentage of the total cost of certain furniture and fixtures used in the scope of work. Hence, as scope increases, so do earnings.
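As stated, a designer's earnings can combine a time-based fee with a margin charged on the furniture and fixtures in the scope of work. A minimal sketch of that arithmetic, with entirely hypothetical rates and figures:

```python
def project_earnings(hours, hourly_rate, furnishings_cost, margin_pct):
    """Total earnings: time billed plus a percentage margin on goods specified."""
    fee = hours * hourly_rate
    margin = furnishings_cost * margin_pct / 100
    return fee + margin

# Hypothetical project: 40 hours at $75/hr plus a 20% margin on $15,000
# of furnishings used in the scope of work.
total = project_earnings(40, 75, 15_000, 20)
print(total)  # 3000 + 3000 = 6000.0
```

Note how the margin term grows directly with scope, which is why larger, more detailed projects translate to higher earnings.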
43. Fashion design:
Fashion design is the applied art dedicated to clothing and lifestyle accessories created within the cultural and social influences of a specific time.
It differs from costume design in that fashion design has a built-in obsolescence, usually of one to two seasons. A season is defined as either autumn/winter or spring/summer.
Structure
Fashion designers can work in a number of ways. Some work full-time for a single fashion company, as in-house designers, and the company owns the designs. They may work alone or as part of a team. Freelance designers work for themselves, selling their designs to fashion houses, directly to shops, or to clothing manufacturers; the garments bear the buyer's label. Some fashion designers set up their own labels, under which their designs are marketed.
Some fashion designers are self-employed and design for individual clients. Other high-fashion designers cater to specialty stores or high-fashion department stores. These designers create original garments, as well as those that follow established fashion trends. Most fashion designers, however, work for apparel manufacturers, creating designs of men’s, women’s, and children’s fashions for the mass market. Large designer brands which have a 'name' as their brand such as Calvin Klein, Ralph Lauren, or Chanel are likely to be designed by a team of individual designers under the direction of a designer director.
Designing a collection
A fashion collection is something that designers put together each season to show their idea of new trends, in both their high-end couture range and their mass-market range.
Fashion designers must take numerous matters into account when designing clothes for a collection, including consistency of theme and style. They will also take into account views of existing customers, previous fashions and styles of competitors, and anticipated fashion trends, as well as the season for the collection.
Designing a garment
Fashion designers work in different ways. Some sketch their ideas on paper, while others drape fabric on a dress stand. When a designer is completely satisfied with the fit of the toile (or muslin), he or she will consult a professional pattern maker who then makes the finished, working version of the pattern out of card. The pattern maker's job is very precise and painstaking. The fit of the finished garment depends on their accuracy. Finally, a sample garment is made up in the proper fabric and tested on a fitting model.
44. Aerospace:
Aerospace comprises the atmosphere of Earth and surrounding space. Typically the term is used to refer to the industry that researches, designs, manufactures, operates, and maintains vehicles moving through air and space. Aerospace is a very diverse field, with a multitude of commercial, industrial and military applications. Aerospace is not the same as airspace, which is a term used to describe the physical air space directly above a location on the ground.
Aerospace manufacturing:
Aerospace manufacturing is a high-technology industry that produces "aircraft, guided missiles, space vehicles, aircraft engines, propulsion units, and related parts," according to the web site of the Bureau of Labor Statistics of the United States. Most of the industry is geared toward governmental work. The US government assigns each Original Equipment Manufacturer (OEM) a CAGE code. These codes help identify each manufacturer, repair facility, and other critical aftermarket vendors in the aerospace industry.
In the European Union, aerospace companies such as EADS, BAE Systems, Thales, Dassault, Saab and Finmeccanica account for a large share of the global aerospace industry and research effort, with the European Space Agency as one of the largest consumers of aerospace technology and products.
In the People's Republic of China, Beijing, Xian, Chengdu, Shanghai, Shenyang and Nanchang are major research and manufacturing centers of the aerospace industry. China has developed an extensive capability to design, test and produce military aircraft, missiles and space vehicles. Despite the cancellation in 1984 of the experimental Y-10 program, China is still developing its civil aerospace industry.
In India, Bangalore is a major centre of the aerospace industry, being the place where Hindustan Aeronautics Limited, the National Aerospace Laboratories and the Indian Space Research Organization are headquartered. The Indian Space Research Organization (ISRO) is undertaking a project to send an orbiter to the Moon, due in mid-2008. This project has been titled Chandrayaan (Moon Craft).
In Russia, large aerospace companies like Oboronprom and the United Aircraft Building Corporation are among the major global players in this industry. The Soviet Union was also home to a major aerospace industry.
The United Kingdom formerly attempted to maintain its own large aerospace industry, making its own airliners, warplanes, etc., but it has largely turned its lot over to cooperative efforts with continental companies, and it has turned into a large import customer, too, from countries like the United States. However, the UK has a very active aerospace sector supplying components, sub-assemblies and sub-systems to other manufacturers, both in Europe and all over the world, including the United States.
In the United States of America, the Department of Defense and the National Aeronautics and Space Administration (NASA) are the two largest consumers of aerospace technology and products. Others include the very large airline industry. The U.S. Bureau of Labor Statistics reported that the aerospace industry provided 444,000 wage and salary jobs in 2004. Most of those jobs were in Washington State and in California, with Missouri and Texas also important. The leading aerospace manufacturers in the U.S. and in the world are Boeing, United Technologies Corporation, and the Lockheed Martin Corp.
Important locations of the civilian aerospace industry worldwide include Washington State (Boeing); California (Boeing, Lockheed Martin, etc.); Montreal, Canada (Bombardier, Pratt & Whitney Canada); Toulouse, France (Airbus/EADS); and Hamburg, Germany (Airbus/EADS); as well as São José dos Campos, Brazil, where the Embraer company is based. Some sources place Boeing in Chicago, but that is merely an office location, not an industrial one. Boeing makes its large civil airplanes on the West Coast of the United States.
45. Laptop:
A laptop (also known as a notebook) is a personal computer designed for mobile use, small enough to sit on one's lap.[1] A laptop includes most of the typical components of a desktop computer, including a display, a keyboard, a pointing device (a touchpad, also known as a trackpad, and/or a pointing stick), speakers, as well as a battery, in a single small and light unit. The rechargeable battery required is charged from an AC/DC adapter and typically stores enough energy to run the laptop for two to three hours in its initial state, depending on the configuration and power management of the computer.
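The two-to-three-hour runtime figure can be sanity-checked with simple arithmetic: battery capacity in watt-hours divided by average power draw in watts. A sketch with assumed, purely illustrative numbers:

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Estimated runtime: stored energy divided by average power consumption."""
    return battery_wh / avg_draw_w

# Hypothetical 48 Wh battery at different average draws:
print(runtime_hours(48, 20))  # 2.4 hours under a heavier load
print(runtime_hours(48, 16))  # 3.0 hours with better power management
```

The same calculation shows why power management matters: lowering average draw extends runtime proportionally.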
Laptops are usually shaped like a large notebook with thicknesses between 0.7–1.5 inches (18–38 mm) and dimensions ranging from 10x8 inches (27x22cm, 13" display) to 15x11 inches (39x28cm, 17" display) and up. Modern laptops weigh 3 to 12 pounds (1.4 to 5.4 kg); older laptops were usually heavier. Most laptops are designed in the flip form factor to protect the screen and the keyboard when closed. Modern 'tablet' laptops have a complex joint between the keyboard housing and the display, permitting the display panel to twist and then lay flat on the keyboard housing. They usually have a touch screen display and some include handwriting recognition or graphics drawing capability.
Laptops were originally considered to be "a small niche market"[2] and were thought suitable mostly for "specialized field applications" such as "the military, the Internal Revenue Service, accountants and sales representatives".[2][3] Battery-powered portable computers had just 2% worldwide market share in 1986[4]. But today, there are already more laptops than desktops in businesses, and laptops are becoming obligatory for student use and more popular for general use.[5] In 2008 more laptops than desktops were sold in the US[6] and according to a forecast by the research firm IDC and Intel, the same milestone will be achieved in the worldwide PC market as soon as 2009.
46. Desktop alert:
Desktop alerting is a relatively young sector of information logistics, in which information is sent to an application that displays its content directly on the user's computer desktop. Typically, the alert is delivered to a client computer from a central provider. The client is either an opt-in software product, for example an RSS reader, or an enterprise-class software product in a LAN. As a workplace tool, desktop alerting can serve as a means of reliable information distribution. Desktop alerting has become popular among administrative, corporate and academic organizations.
Conceptual
A desktop alert is an electronic message sent to computers over a local network or over the internet. Techniques used to deliver the message vary, but the end result is usually the appearance of a message on the end-user desktop. Each recipient's computer typically has a client application installed and running during the entire user session. Targeting recipients can be accomplished by LDAP/Active Directory integration in a corporate environment or by subscriptions for private users.
From a commercial or corporate point of view, desktop alerting offers some advantages over email and telephone communication. In many cases, the client application can be configured to start automatically with each user session and is often subject to administrative rights only. This offers an advantage over corporate email solutions: the recipient doesn't have to take any action in order to receive messages. Compared to telephony, a larger number of recipients can be informed in less time, because messages can be sent instantly to hundreds or thousands of users. These benefits have promoted the use of commercial solutions in companies that need reliable methods of one-to-one and one-to-many communication, including in emergency management.
Fields of application
Desktop alerting has several fields of application. Commercial applications can be used to deliver important corporate information, for example important notifications from a Service Desk to users in a corporate network or advertisement to consumers. Also very popular are RSS feeds that deliver information on the recipient's personal interest, e.g. sports or economic news. Another field of application is the delivery of information in Emergency Management.
47. DTS (sound system):
DTS (also known as Digital Theater System(s)), owned by DTS, Inc. (NASDAQ: DTSI), is a multi-channel digital surround sound format used for both commercial/theatrical and consumer-grade applications. It is used for in-movie sound both on film and on DVD and CD; during the last few years of the Laserdisc format's existence, several releases also had DTS soundtracks.
One of the company's initial investors was film director Steven Spielberg, who felt that theatrical sound formats up until the company's founding were no longer state of the art, and as a result were no longer optimal for use on projects where quality sound reproduction was of the utmost importance. Work on the format started in 1991, four years after Dolby Labs started work on its new codec, Dolby Digital. The basic and most common version of the format is a 5.1-channel system, similar to a Dolby Digital setup, which encodes the audio as five primary (full-range) channels plus a special LFE (low-frequency effects) channel for the subwoofer. Note, however, that encoders and decoders support numerous channel combinations, and stereo, four-channel and four-channel+LFE soundtracks have been released commercially on DVD, CD and Laserdisc.
Other newer DTS variants are also currently available, including versions that support up to seven primary audio channels plus one LFE channel (DTS-ES). DTS's main competitors in multichannel theatrical audio are Dolby Digital and SDDS, although only Dolby Digital and DTS are used on DVDs and implemented in home theater hardware. Spielberg debuted the format with his 1993 production of Jurassic Park, which came slightly less than a full year after the official theatrical debut of Dolby Digital (Batman Returns). In addition, Jurassic Park also became the first home video release to contain DTS sound when it was released on LaserDisc in January 1997, two years after the first Dolby Digital home video release (Clear and Present Danger on Laserdisc) which debuted in January 1995.
In theatrical use, information in the form of a modified time code is optically imaged onto the film. An optical LED reader reads the time code data off the film and sends it to the DTS processor which uses this time code to synchronize the projected image with the soundtrack audio. The actual audio is recorded in compressed form on standard CD-ROM media at a bit rate of 1,103 kbit/s. The processor also acts as a transport mechanism, as it holds and reads the audio discs. Newer units can generally hold three discs, allowing a single processor/transport to handle two-disc film soundtracks along with a third disc containing sound for theatrical trailers. In addition, specific elements of the imprinted time code allow identifying data to be embedded within the code, ensuring that a certain film's soundtrack will only run with that film.
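The 1,103 kbit/s figure also explains the multi-disc transports: a rough calculation (assuming a standard 650 MB CD-ROM, and treating MB as 1024×1024 bytes) shows that one disc holds a bit over 80 minutes of audio, so a long feature spills onto a second disc:

```python
def minutes_per_disc(disc_mb=650, bitrate_kbps=1103):
    """Approximate minutes of compressed audio that fit on one disc."""
    bits = disc_mb * 1024 * 1024 * 8       # disc capacity in bits
    seconds = bits / (bitrate_kbps * 1000) # playback time at the given bitrate
    return seconds / 60

print(round(minutes_per_disc()))  # ~82 minutes per disc
```

A two-and-a-half-hour film therefore needs two discs, with the third slot in newer units left free for trailer audio, as described above.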
48. Keyboard (computing):
In computing, a keyboard is an input device, partially modeled after the typewriter keyboard, which uses an arrangement of buttons or keys, which act as mechanical levers or electronic switches. A keyboard typically has characters engraved or printed on the keys and each press of a key typically corresponds to a single written symbol. However, to produce some symbols requires pressing and holding several keys simultaneously or in sequence. While most keyboard keys produce letters, numbers or signs (characters), other keys or simultaneous key presses can produce actions or computer commands.
In normal usage, the keyboard is used to type text and numbers into a word processor, text editor or other program. In a modern computer, the interpretation of key presses is generally left to the software. A computer keyboard distinguishes each physical key from every other and reports all key presses to the controlling software. Keyboards are also used for computer gaming, either with regular keyboards or by using keyboards with special gaming features, which can expedite frequently used keystroke combinations. A keyboard is also used to give commands to the operating system of a computer, such as Windows' Control-Alt-Delete combination, which brings up a task window or shuts down the machine.
Alphabetical, numeric, and punctuation keys are used in the same fashion as a typewriter keyboard to enter their respective symbol into a word processing program, text editor, data spreadsheet, or other program. Many of these keys will produce different symbols when modifier keys or shift keys are pressed. The alphabetic characters become uppercase when the shift key or Caps Lock key is depressed. The numeric characters become symbols or punctuation marks when the shift key is depressed. The alphabetical, numeric, and punctuation keys can also have other functions when they are pressed at the same time as some modifier keys.
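The modifier behavior described above can be sketched as a small lookup performed by software; this is a toy model, not any real keyboard driver, and the symbol map is a tiny illustrative subset:

```python
# Hypothetical shifted-symbol map for a few non-letter keys (US-style layout).
SHIFT_MAP = {"1": "!", "2": "@", ";": ":"}

def interpret(key, shift=False, caps_lock=False):
    """Resolve a key press plus modifier state into a character."""
    if key.isalpha() and (shift ^ caps_lock):
        return key.upper()          # letters: Shift and Caps Lock each flip case
    if shift:
        return SHIFT_MAP.get(key, key)  # numbers/punctuation: Shift picks a symbol
    return key

print(interpret("a", shift=True))                   # 'A'
print(interpret("1", shift=True))                   # '!'
print(interpret("a", shift=True, caps_lock=True))   # 'a' (Shift cancels Caps Lock)
```

The XOR on the letter branch captures the detail that Shift pressed while Caps Lock is active yields lowercase again.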
The Space bar is a horizontal bar in the lowermost row, which is significantly wider than other keys. Like the alphanumeric characters, it is also descended from the mechanical typewriter. Its main purpose is to enter the space between words during typing. It is large enough so that a thumb from either hand can use it easily. Depending on the operating system, when the space bar is used with a modifier key such as the control key, it may have functions such as resizing or closing the current window, half-spacing, or backspacing. In computer games and other applications the key has myriad uses in addition to its normal purpose in typing, such as jumping and adding marks to check boxes. In certain programs for playback of digital video, the space bar is used for pausing and resuming the playback.
49. USB flash drive:
A USB flash drive consists of a NAND-type flash memory data storage device integrated with a USB (Universal Serial Bus) interface. USB flash drives are typically removable and rewritable, much smaller than a floppy disk, and most USB flash drives weigh less than an ounce (30 g). Storage capacities typically range from 64 MB to 128 GB with steady improvements in size and price per capacity. Some allow 1 million write or erase cycles and have 10-year data retention, connected by USB 1.1 or USB 2.0.
USB flash drives offer potential advantages over other portable storage devices, particularly the floppy disk. They have a more compact shape, operate faster, hold much more data, have a more durable design, and operate more reliably due to their lack of moving parts. Additionally, it has become increasingly common for computers to be sold without floppy disk drives. USB ports, on the other hand, appear on almost every current mainstream PC and laptop. These types of drives use the USB mass storage standard, supported natively by modern operating systems such as Windows, Mac OS X, Linux, and other Unix-like systems. USB drives with USB 2.0 support can also operate faster than an optical disc drive, while storing a larger amount of data in a much smaller space.
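The speed advantage over floppy disks can be illustrated with a rough transfer-time calculation; the rates below are assumed real-world figures for illustration, not specification maxima:

```python
def transfer_seconds(size_mb, rate_mb_per_s):
    """Time to move a file at a sustained transfer rate."""
    return size_mb / rate_mb_per_s

# Moving 100 MB of data (hypothetical sustained rates):
print(transfer_seconds(100, 0.06))  # floppy-class ~0.06 MB/s: well over 1000 s
print(transfer_seconds(100, 15))    # typical USB 2.0 flash drive: under 10 s
```

The same 100 MB would also require roughly seventy 1.44 MB floppies, which is the capacity gap the paragraph above describes.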
Nothing actually moves in a flash drive: the term drive persists because computers read and write flash-drive data using the same system commands as for a mechanical disk drive, with the storage appearing to the computer operating system and user interface as just another drive.
A flash drive consists of a small printed circuit board protected inside a plastic, metal, or rubberized case, robust enough for carrying with no additional protection—in a pocket or on a key chain, for example. The USB connector is protected by a removable cap or by retracting into the body of the drive, although it is not likely to be damaged if exposed (but it may damage other items, for example a bag it is placed in). Most flash drives use a standard type-A USB connection allowing plugging into a port on a personal computer, but drives for other interfaces also exist.
50. Bluetooth:
Bluetooth is an open wireless protocol for exchanging data over short distances from fixed and mobile devices, creating personal area networks (PANs). It was originally conceived as a wireless alternative to RS232 data cables. It can connect several devices, overcoming problems of synchronization. Bluetooth uses a radio technology called frequency-hopping spread spectrum, which chops up the data being sent and transmits chunks of it on up to 79 frequencies. In its basic mode, the modulation is Gaussian frequency-shift keying (GFSK).
It can achieve a gross data rate of 1 Mb/s. Bluetooth provides a way to connect and exchange information between devices such as mobile phones, telephones, laptops, personal computers, printers, Global Positioning System (GPS) receivers, digital cameras, and video game consoles through a secure, globally unlicensed Industrial, Scientific and Medical (ISM) 2.4 GHz short-range radio frequency bandwidth. The Bluetooth specifications are developed and licensed by the Bluetooth Special Interest Group (SIG). The Bluetooth SIG consists of companies in the areas of telecommunication, computing, networking, and consumer electronics.
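The frequency-hopping idea can be sketched as both ends deriving the same pseudorandom sequence of channels; real Bluetooth derives the sequence from the master device's address and clock, so the shared seed here is a simplification:

```python
import random

def hop_sequence(shared_seed, hops):
    """Pseudorandom sequence over Bluetooth's 79 basic-rate channels."""
    rng = random.Random(shared_seed)
    return [rng.randrange(79) for _ in range(hops)]

# Sender and receiver seeded identically stay in sync hop after hop.
sender = hop_sequence(0xB1, 5)
receiver = hop_sequence(0xB1, 5)
assert sender == receiver
print(sender)
```

Because each data chunk occupies any given frequency only briefly, narrowband interference corrupts at most a few hops rather than the whole transmission.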
List of applications
More prevalent applications of Bluetooth include:
- Wireless control of and communication between a mobile phone and a hands-free headset. This was one of the earliest applications to become popular.
- Wireless networking between PCs in a confined space and where little bandwidth is required.
- Wireless communication with PC input and output devices, the most common being the mouse, keyboard and printer.
- Transfer of files, contact details, calendar appointments, and reminders between devices with OBEX.
- Replacement of traditional wired serial communications in test equipment, GPS receivers, medical equipment, bar code scanners, and traffic control devices.
- Controls where infrared was traditionally used.
- Low-bandwidth applications where higher USB bandwidth is not required and a cable-free connection is desired.
- Sending small advertisements from Bluetooth-enabled advertising hoardings to other, discoverable Bluetooth devices.
- Wireless controllers for two seventh-generation game consoles, Nintendo's Wii[6] and Sony's PlayStation 3.
- Dial-up internet access on personal computers or PDAs using a data-capable mobile phone as a modem.
51. WI-FI:
Wi-Fi is a trademark of the Wi-Fi Alliance for certified products based on the IEEE 802.11 standards. This certification warrants interoperability between different wireless devices. The term Wi-Fi[1][2] is often used by the public as a synonym for wireless LAN (WLAN); but not every wireless LAN product has a Wi-Fi certification, which may be because of certification costs that must be paid for each certified device type. Wi-Fi is supported by most personal computer operating systems, many game consoles, laptops, smart phones, printers, and other peripherals.
Uses:
In addition to private use in homes and offices, Wi-Fi can make access publicly available at Wi-Fi hotspots, provided either free of charge or to subscribers of various providers. Organizations and businesses such as airports, hotels and restaurants often provide free hotspots to attract or assist clients. Enthusiasts or authorities who wish to provide services, or even to promote business in a given area, sometimes offer free Wi-Fi access. There are already more than 300 metropolitan-wide Wi-Fi (Muni-Fi) projects. When wireless networking technology first entered the market, many problems ensued for consumers because products from different vendors could not be relied on to work together. The Wi-Fi Alliance began as a community to solve this issue, aiming to address the needs of the end-user and to allow the technology to mature. The Alliance created the Wi-Fi CERTIFIED branding to reassure consumers that products will interoperate with other products displaying the same branding.
Many consumer devices use Wi-Fi. Amongst others, personal computers can network to each other and connect to the Internet, mobile computers can connect to the Internet from any Wi-Fi hotspot, and digital cameras can transfer images wirelessly.
Routers which incorporate a DSL modem or a cable modem and a Wi-Fi access point, often set up in homes and other premises, provide Internet access and internetworking to all devices connected (wirelessly or by cable) to them. One can also connect Wi-Fi devices in ad-hoc mode for client-to-client connections without a router. Wi-Fi also enables places which would traditionally not have network access to be connected, for example bathrooms, kitchens and garden sheds.
As of 2007 Wi-Fi technology had spread widely within business and industrial sites. In business environments, just like other environments, increasing the number of Wi-Fi access-points provides redundancy, support for fast roaming and increased overall network-capacity by using more channels or by defining smaller cells. Wi-Fi enables wireless voice-applications (VoWLAN or WVOIP). Over the years, Wi-Fi implementations have moved toward "thin" access-points, with more of the network intelligence housed in a centralized network appliance, relegating individual access-points to the role of mere "dumb" radios. Outdoor applications may utilize true mesh topologies. As of 2007 Wi-Fi installations can provide a secure computer networking gateway, firewall, DHCP server, intrusion detection system, and other functions.
52. DRE voting machine:
A direct-recording electronic (DRE) voting machine records votes by means of a ballot display provided with mechanical or electro-optical components that can be activated by the voter (typically buttons or a touchscreen); that processes data by means of a computer program; and that records voting data and ballot images in memory components. After the election it produces a tabulation of the voting data stored in a removable memory component and as printed copy. The system may also provide a means for transmitting individual ballots or vote totals to a central location for consolidating and reporting results from precincts at the central location.
Benefits of DRE voting machines
A Hart eSlate DRE voting machine with jelly buttons for people with manual dexterity disabilities.
Like all voting machines, DRE systems increase the speed of vote counting. They can also incorporate the broadest assistive technologies for the largest classes of handicapped people, allowing them to vote without forfeiting the anonymity of their vote. These machines can use headphones and other adaptive technology to provide the necessary accessibility. DREs can also provide the most robust form of immediate feedback to the voter, detecting possible problems such as undervoting and overvoting, which may result in a spoiled ballot. This immediate feedback can be helpful in successfully determining voter intent.
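The undervote/overvote feedback described above amounts to a simple check on the number of selections for a contest. A minimal sketch (real DRE validation logic is far more involved):

```python
def check_ballot(selections, max_allowed):
    """Classify a contest's selections before the voter casts the ballot."""
    if len(selections) == 0:
        return "undervote"   # warn: no candidate chosen for this contest
    if len(selections) > max_allowed:
        return "overvote"    # would spoil a comparable paper ballot
    return "ok"

print(check_ballot([], 1))           # 'undervote'
print(check_ballot(["A", "B"], 1))   # 'overvote'
print(check_ballot(["A"], 1))        # 'ok'
```

Because the machine runs this check before the ballot is cast, the voter can correct the problem immediately, which is exactly the feedback paper ballots cannot give.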
Additionally, with DRE voting systems there is no risk of exhausting the supply of paper ballots, and they remove the need to print paper ballots, a significant cost. When administering elections in which ballots are offered in multiple languages (as some areas of the United States are required to do by the Voting Rights Act of 1965), DRE voting systems can be programmed to provide ballots in multiple languages on a single machine. For example, King County, Washington's demographics require it under U.S. federal election law to provide ballot access in Chinese. With any type of paper ballot, the county has to decide how many Chinese-language ballots to print, how many to make available at each polling place, and so on. Any strategy that can assure that Chinese-language ballots will be available at all polling places is certain, at the very least, to result in a lot of wasted ballots.
53. Touch screen:
A touch screen is a display that can detect the presence and location of a touch within the display area. The term generally refers to touch or contact with the display of the device by a finger or hand. Touch screens can also sense other passive objects, such as a stylus. However, if the object sensed is active, as with a light pen, the term touch screen is generally not applicable. The ability to interact directly with a display typically indicates the presence of a touch screen. The touch screen has two main attributes. First, it enables one to interact with what is displayed directly on the screen, where it is displayed, rather than indirectly with a mouse or touchpad. Secondly, it lets one do so without requiring any intermediate device, again, such as a stylus that needs to be held in the hand. Such displays can be attached to computers or, as terminals, to networks. They also play a prominent role in the design of digital appliances such as the personal digital assistant (PDA), satellite navigation devices, mobile phones, and video games.
Building touch screens
There are several principal ways to build a touch screen. The key goals are to recognize one or more fingers touching a display, to interpret the command that this represents, and to communicate the command to the appropriate application. In the most popular techniques, the capacitive and resistive approaches, manufacturers coat the screen with a thin, transparent metallic layer. When a user touches the surface, the system records the change in the electrical current that flows through the display.
Dispersive-signal technology, which 3M created in 2002, measures the piezoelectric effect (the voltage generated when mechanical force is applied to a material) that occurs when a strengthened glass substrate is touched. There are two infrared-based approaches. In one, an array of sensors detects a finger touching or almost touching the display, thereby interrupting light beams projected over the screen. In the other, bottom-mounted infrared cameras record screen touches. In each case, the system determines the intended command based on the controls showing on the screen at the time and the location of the touch.
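In a resistive design, the coordinate read-out reduces to a voltage-divider ratio: a reference voltage is applied across one conductive layer, and the voltage sensed at the touch point indicates how far along the axis the touch lies. A toy illustration with hypothetical voltages:

```python
def touch_coordinate(measured_v, ref_v, screen_extent_px):
    """Map a divided voltage to a pixel coordinate along one axis."""
    return measured_v / ref_v * screen_extent_px

# Hypothetical 3.3 V reference on an 800-pixel-wide axis; a touch at the
# middle of the layer divides the voltage in half.
print(touch_coordinate(1.65, 3.3, 800))  # 400.0 px
```

The controller repeats the measurement with the roles of the two layers swapped to obtain the other axis, giving a full (x, y) position.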
54. Optical imaging:
Optical imaging is an imaging technique. Optics usually describes the behavior of visible, ultraviolet, and infrared light used in imaging.
Diffusive optical imaging in neuroscience
Diffusive optical imaging is a technique that gives cognitive neuroscientists the ability to simultaneously obtain information about the source of neural activity as well as its time course. In other words, it allows them to "see" neural activity and study the functioning of the brain. In this method, a near-infrared laser is positioned on the scalp. Detectors composed of optical fiber bundles are located a few centimeters away from the light source. These detectors sense how the path of light is altered, either through absorption or scattering, as it traverses brain tissue.
This method can provide two types of information. First, it can be used to measure the absorption of light, which is related to concentration of chemicals in the brain. Second, it can measure the scattering of light, which is related to physiological characteristics such as the swelling of glia and neurons that are associated with neuronal firing.
Typical applications include rapid 2D optical topographic imaging of the Event Related Optical Signal (EROS) or near infrared spectroscopy (NIRS) signal following brain activity and topographic reconstruction of an entire 3D volume of tissue to diagnose breast cancer or neonatal brain hemorrhage. The spatial resolution of DOT techniques is several millimeters, comparable to the lower end of functional magnetic resonance imaging (fMRI).
The temporal resolution of EROS is very good, comparable to electroencephalography and magnetoencephalography (~milliseconds), while that of NIRS, which measures hemodynamic changes rather than neuronal activity, is comparable to fMRI (~seconds). DOT instruments are relatively low-cost (~$150,000), portable and immune to electrical interference. The signal-to-noise ratio of NIRS is quite good, enabling detection of responses to single events in many cases. EROS signals are much weaker, typically requiring averaging of many responses. Important chemicals that this method can detect include hemoglobin and cytochromes.
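The absorption measurements mentioned above are commonly converted to chromophore concentration changes via the modified Beer-Lambert law, delta_A = epsilon * delta_c * d * DPF, where d is the source-detector distance and DPF the differential pathlength factor. The sketch below uses entirely hypothetical values:

```python
def delta_concentration(delta_a, epsilon, distance_cm, dpf):
    """Solve the modified Beer-Lambert law for the concentration change.

    delta_a:     measured change in absorbance (dimensionless)
    epsilon:     extinction coefficient of the chromophore, 1/(mM*cm)
    distance_cm: source-detector separation on the scalp
    dpf:         differential pathlength factor (light scatters, so the true
                 path through tissue is longer than the straight-line distance)
    """
    return delta_a / (epsilon * distance_cm * dpf)

# Hypothetical numbers: a 0.012 absorbance change, epsilon = 2.0 /(mM*cm),
# 3 cm separation, DPF of 6.
print(delta_concentration(0.012, 2.0, 3.0, 6.0))  # about 3.3e-4 mM
```

With extinction coefficients for oxy- and deoxyhemoglobin at two wavelengths, the same relation lets the instrument separate the two species.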
55. Liquid crystal display:
A liquid crystal display (LCD) is an electronically-modulated optical device shaped into a thin, flat panel made up of any number of color or monochrome pixels filled with liquid crystals and arrayed in front of a light source (backlight) or reflector. It is often utilized in battery-powered electronic devices because it uses very small amounts of electric power.
A comprehensive classification of the various types and electro-optical modes of LCDs is provided in the article LCD classification.
Each pixel of an LCD typically consists of a layer of molecules aligned between two transparent electrodes, and two polarizing filters, the axes of transmission of which are (in most of the cases) perpendicular to each other. With no actual liquid crystal between the polarizing filters, light passing through the first filter would be blocked by the second (crossed) polarizer.
The surfaces of the electrodes that are in contact with the liquid crystal material are treated so as to align the liquid crystal molecules in a particular direction. This treatment typically consists of a thin polymer layer that is unidirectional rubbed using, for example, a cloth. The direction of the liquid crystal alignment is then defined by the direction of rubbing. Electrodes are made of a transparent conductor called Indium Tin Oxide (ITO).
Before applying an electric field, the orientation of the liquid crystal molecules is determined by the alignment at the surfaces. In a twisted nematic device (still the most common liquid crystal device), the surface alignment directions at the two electrodes are perpendicular to each other, and so the molecules arrange themselves in a helical structure, or twist. This twist induces a rotation of the polarization of the incident light, and the device appears grey. If the applied voltage is large enough, the liquid crystal molecules in the center of the layer are almost completely untwisted and the polarization of the incident light is not rotated as it passes through the liquid crystal layer. This light will then be mainly polarized perpendicular to the second filter, and thus be blocked and the pixel will appear black. By controlling the voltage applied across the liquid crystal layer in each pixel, light can be allowed to pass through in varying amounts, thus constituting different levels of gray.
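The role of the crossed polarizers can be sketched with Malus's law, which gives the fraction of polarized light transmitted through a polarizer as a function of the angle between the light's polarization and the polarizer's axis. This is a simplified model of the ideal limiting cases, not of a real twisted nematic cell's full voltage response:

```python
import math

def transmitted_fraction(angle_deg):
    """Malus's law: fraction of polarized light passing a polarizer whose
    transmission axis is angle_deg away from the light's polarization."""
    return math.cos(math.radians(angle_deg)) ** 2

# Polarization rotated 90 degrees by the twisted liquid crystal ends up
# aligned with the second polarizer's axis (0 degree offset): transmitted.
print(transmitted_fraction(0))   # 1.0
# Untwisted cell (voltage applied): polarization stays perpendicular
# to the second polarizer, so the pixel is dark.
print(transmitted_fraction(90))  # ~0.0
```

Intermediate voltages leave the light partially rotated, which is how the varying gray levels described above arise.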
56. Facial recognition system:
A facial recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame from a video source. One of the ways to do this is by comparing selected facial features from the image with a facial database.
Techniques:
Traditional
Some facial recognition algorithms identify faces by extracting landmarks, or features, from an image of the subject's face. For example, an algorithm may analyze the relative position, size, and/or shape of the eyes, nose, cheekbones, and jaw. These features are then used to search for other images with matching features. Other algorithms normalize a gallery of face images and then compress the face data, only saving the data in the image that is useful for face detection. A probe image is then compared with the face data. One of the earliest successful systems is based on template matching techniques applied to a set of salient facial features, providing a sort of compressed face representation. Popular recognition algorithms include eigenface, fisherface, the Hidden Markov model, and neuronally motivated dynamic link matching.
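The eigenface idea mentioned above can be illustrated with a minimal sketch: gallery images are projected onto their principal components, each image is stored as a short signature, and a probe is matched to the nearest signature. The data here is synthetic random vectors standing in for flattened face images, not real faces:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "gallery": 20 flattened 8x8 images (synthetic placeholder data).
gallery = rng.normal(size=(20, 64))

# Eigenfaces: principal components of the mean-centered gallery.
mean_face = gallery.mean(axis=0)
centered = gallery - mean_face
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:10]               # keep the 10 strongest components

# Compress every gallery image to a 10-number signature.
signatures = centered @ eigenfaces.T

def identify(probe):
    """Project a probe into eigenface space; return the nearest gallery index."""
    coeffs = (probe - mean_face) @ eigenfaces.T
    dists = np.linalg.norm(signatures - coeffs, axis=1)
    return int(np.argmin(dists))

# A slightly noisy copy of gallery image 7 should still match image 7.
probe = gallery[7] + rng.normal(scale=0.05, size=64)
print(identify(probe))  # 7
```

Real systems add face detection, geometric normalization, and illumination correction before this projection step.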
3-D
A newly emerging trend, claimed to achieve previously unseen accuracies, is three-dimensional face recognition. This technique uses 3-D sensors to capture information about the shape of a face. This information is then used to identify distinctive features on the surface of a face, such as the contour of the eye sockets, nose, and chin. One advantage of 3-D facial recognition is that it is not affected by changes in lighting like other techniques. It can also identify a face from a range of viewing angles, including a profile view.
57. Speech recognition:
Speech recognition (also known as automatic speech recognition or computer speech recognition) converts spoken words to machine-readable input (for example, to key presses, using the binary code for a string of character codes). The term "voice recognition" is sometimes incorrectly used to refer to speech recognition, when actually referring to speaker recognition, which attempts to identify the person speaking, as opposed to what is being said. Confusingly, journalists and manufacturers of devices that use speech recognition for control commonly use the term Voice Recognition when they mean Speech Recognition.
In the health care domain, even in the wake of improving speech recognition technologies, medical transcriptionists (MTs) have not yet become obsolete. Many experts in the field anticipate that with increased use of speech recognition technology, the services provided may be redistributed rather than replaced.
Speech recognition can be implemented at the front end or the back end of the medical documentation process. Front-end SR is where the provider dictates into a speech-recognition engine, the recognized words are displayed right after they are spoken, and the dictator is responsible for editing and signing off on the document; it never goes through an MT/editor.
Back-End SR or Deferred SR is where the provider dictates into a digital dictation system, and the voice is routed through a speech-recognition machine and the recognized draft document is routed along with the original voice file to the MT/editor, who edits the draft and finalizes the report. Deferred SR is being widely used in the industry currently.
Many Electronic Medical Records (EMR) applications can be more effective and may be performed more easily when deployed in conjunction with a speech-recognition engine. Searches, queries, and form filling may all be faster to perform by voice than by using a keyboard.
58. Eye (cyclone):
The eye is a region of mostly calm weather found at the center of strong tropical cyclones. The eye of a storm is a roughly circular area and typically 30–65 km (20–40 miles) in diameter. It is surrounded by the eyewall, a ring of towering thunderstorms where the most severe weather of a cyclone occurs. The cyclone's lowest barometric pressure occurs in the eye, and can be as much as 15% lower than the atmospheric pressure outside the storm.
In strong tropical cyclones, the eye is characterized by light winds and clear skies, surrounded on all sides by a towering, symmetric eyewall. In weaker tropical cyclones, the eye is less well-defined, and can be covered by the central dense overcast, which is an area of high, thick clouds which show up brightly on satellite imagery. Weaker or disorganized storms may also feature an eyewall which does not completely encircle the eye, or have an eye which features heavy rain. In all storms, however, the eye is the location of the storm's minimum barometric pressure: the area where the atmospheric pressure at sea level is the lowest.
Formation and detection:
Tropical cyclones typically form from large, disorganized areas of disturbed weather in tropical regions. As more thunderstorms form and gather, the storm develops rainbands which start rotating around a common center. As the storm gains strength, a ring of stronger convection forms at a certain distance from the rotational center of the developing storm. Since stronger thunderstorms and heavier rain mark areas of stronger updrafts, the barometric pressure at the surface begins to drop, and air begins to build up in the upper levels of the cyclone. This results in the formation of an upper-level anticyclone, or an area of high atmospheric pressure above the central dense overcast. Consequently, most of this built-up air flows outward anticyclonically above the tropical cyclone. Outside the forming eye, the anticyclone at the upper levels of the atmosphere enhances the flow towards the center of the cyclone, pushing air towards the eyewall and causing a positive feedback loop.
However, a small portion of the built-up air, instead of flowing outward, flows inward towards the center of the storm. This causes air pressure to build even further, to the point where the weight of the air counteracts the strength of the updrafts in the center of the storm. Air begins to descend in the center of the storm, creating a mostly rain-free area; a newly formed eye.
59. Wireless headsets
Wireless headsets are quickly becoming a new trend for both business and consumer communications. There are a number of solutions for wireless, and they usually differ according to application and power management.
DECT wireless headsets:
DECT (Digital Enhanced Cordless Telecommunications) is one of the most common standards for cordless telephones. It uses 1.88 to 1.90 GHz RF (European version) or 1.92 to 1.93 GHz RF (US version) as the frequency bandwidth. Different countries have regulations for the bandwidth used in DECT, but most have pre-set this band for wireless audio transmission. The most common profile of DECT is GAP
(Generic Access Profile), which is used to ensure common communication between a base station and its cordless handset. This common platform allows communication between the two devices even if they are from different manufacturers. For example, a Panasonic DECT base station can theoretically connect to a Siemens DECT handset. Based on this profile, developers such as Plantronics or Jabra have launched wireless headsets which can pair directly with any GAP-enabled DECT telephone. So users with a DECT wireless headset can pair it with their home DECT phones and enjoy wireless communication.
Bluetooth wireless headsets:
Most users have heard about Bluetooth, and although this technology was originally designed for a much wider range of applications, today it is used largely for voice transmission (a notable exception being the use of Bluetooth in the Nintendo Wiimote). The reason for this general exclusivity is the power/range settings of Bluetooth. Bluetooth uses 2.4 GHz RF, similar to WLAN or Wi-Fi; however, by default it is set for very close proximity usage to reduce power consumption. This lack of longer-range coverage made Bluetooth technology undesirable for data transmission. As more and more mobile phones now come equipped with Bluetooth, the technology has become a common wireless profile for mobile phone headsets.
When choosing a Bluetooth headset, users should be aware that Bluetooth headsets come in different types as well. Standard Bluetooth headsets using version 1.0 or 1.1 are often single-sided monaural earpieces, which can only access the headset/hands-free profile of Bluetooth. Depending on the phone's operating system, this type of headset will either play music at a very low quality (because the phone is converting it into a voice signal) or will be unable to play music at all (because the phone cannot perform such a conversion). Users who need a stereo music-playing Bluetooth headset should look for a headset with the A2DP profile.[10] Users should note that some A2DP-equipped headsets will automatically deactivate the microphone function during music listening, so if these headsets are paired to a computer via a Bluetooth connection, the headset may disable either the stereo function or the microphone function.
60. iPod
iPods with color displays use anti-aliased graphics and text, with sliding animations. All iPods (except the iPod shuffle and iPod touch) have five buttons and the later generations have the buttons integrated into the click wheel—an innovation that gives an uncluttered, minimalist interface. The buttons perform basic functions such as menu, play, pause, next track, and previous track. Other operations such as scrolling through menu items and controlling the volume are performed by using the click wheel in a rotational manner. The iPod shuffle does not have any controls on the actual player; instead it has a small control on the earphones cable, with volume up and down buttons and a single button for play/pause, next track, etc. The iPod touch has no click wheel. Instead it uses a 3.5" touch screen in addition to a home button, sleep/wake button and (on the second generation iPod touch) volume up and down buttons. The user interface for the iPod touch is virtually identical to the iPhone. Both devices use the iPhone OS.
Connectivity
[Figure: Four iPod wall chargers, with FireWire (left) and USB (right three) connectors, which allow iPods to charge without a computer; the units have been progressively miniaturized.]
Originally, a FireWire connection to the host computer was used to update songs or recharge the battery. The battery could also be charged with a power adapter that was included with the first four generations. The third generation began including a 30-pin dock connector, allowing for FireWire or USB connectivity. This provided better compatibility with non-Apple machines, as most of them did not have FireWire ports at the time. Eventually Apple began shipping iPods with USB cables instead of FireWire, although the latter was available separately. As of the first generation iPod Nano and the fifth generation iPod Classic, Apple discontinued using FireWire for data transfer (while still allowing for use of FireWire to charge the device) in an attempt to reduce cost and form factor. As of the second-generation iPod Touch and the fourth-generation iPod Nano, FireWire charging ability has been removed. The second and third generation iPod Shuffle uses a single 3.5 mm jack which acts as both a headphone jack and a data port for the dock.
The dock connector also allowed the iPod to connect to accessories, which often supplement the iPod's music, video, and photo playback. Apple sells a few accessories, such as the now-discontinued iPod Hi-Fi, but most are manufactured by third parties such as Belkin and Griffin. Some peripherals use their own interface, while others use the iPod's own screen. Because the dock connector is a proprietary interface, the implementation of the interface requires paying royalties to Apple.
61. Cassette-based walkman
The original blue-and-silver Walkman model TPS-L2 went on sale in Japan on July 1, 1979. In the UK, it came with stereo playback and dual mini headphone jacks, permitting two people to listen at the same time (though it came with only one pair of MDR-3L2 headphones). Where the Pressman had the recording button, the Walkman had a "hotline" button which activated a small built-in microphone, partially overriding the sound from the cassette and allowing one user to talk to the other over the music. The dual jacks and "hotline" button were phased out in the follow-up Walkman II model.
Amid fierce competition, primarily from Toshiba (the Walky), Aiwa (the CassetteBoy) and Panasonic, Sony upped the ante once again in the late 80s by creating the playback-only WM-DD9. Launched in 1989 during the 10th anniversary of the Walkman (five years after the WM-D6C), it became the holy grail for a niche group of cassette Walkman collectors. It is the only auto-reverse Walkman in history to use a two-motor, quartz-locked, disc drive system similar to high-end home cassette decks to ensure accurate tape speed for both sides of playback (only one motor operates at a time, depending on the side of the tape being played).
Power consumption was improved by requiring only either one AA battery or one gumstick-type rechargeable, with optional AC adaptor input. It is also equipped with a tight gap amorphous tape head capable of reproducing the full 20–20,000 Hz frequency range, a gold plated headphone jack, and a 2 mm thick aluminum body. Sony made this model with only sound quality in mind; therefore it contains no gimmick features such as in-line remote control, music search, or LCD readout. Its only features are Dolby B/C noise reduction decoding, Mega Bass/DBB bass boost, tape type select, and two auto reverse modes.
By the late 1990s, the cassette-based Walkman was generally passed over in favor of the emerging digital technologies of CD, DAT and MiniDisc. After 2000, cassette-based Walkman products (and their clones) were approaching technological obsolescence as the cassette format was gradually phased out. However, Sony still continues to make cassette-based Walkman personal stereos today.
62. CD Walkman (Discman)
The first CD based Walkman was initially launched in 1984 — the D-50 (D-5 in some markets). It was officially called the 'Discman', and this name has since been used informally to refer to such players. In recent years, Sony has dropped the Discman name and markets all its personal stereos under the Walkman brand.
Later Discman models featured ESP (Electronic Skip Protection), which pre-read the music from the CD into on-board memory, forming a buffer that prevented the CD from skipping when the player was moved. The technology has since been renamed 'G-Protection' and features a larger memory area, providing additional protection against skipping.
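The read-ahead buffering behind skip protection can be sketched with a simple queue: the optical pickup fills the buffer faster than playback drains it, so short read interruptions are bridged by already-buffered audio. This is a toy model; the class, frame labels, and capacity are all hypothetical illustrations:

```python
from collections import deque

class SkipProtectionBuffer:
    """Toy sketch of an ESP-style read-ahead buffer."""

    def __init__(self, capacity_frames):
        self.buf = deque(maxlen=capacity_frames)

    def fill(self, frames):
        """Pickup reads frames ahead of the playback position."""
        self.buf.extend(frames)

    def play_one(self):
        """Playback drains one frame; returns None on underrun (audible skip)."""
        return self.buf.popleft() if self.buf else None

esp = SkipProtectionBuffer(capacity_frames=5)
esp.fill(["f0", "f1", "f2"])
# Simulate a shock: the pickup reads nothing for 3 frame periods,
# but playback continues from the buffer without an audible skip.
out = [esp.play_one() for _ in range(3)]
print(out)  # ['f0', 'f1', 'f2']
```

A larger buffer (as in G-Protection) simply extends how long a read interruption can last before the buffer underruns.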
Sony still makes CD Walkmans — the newer models are capable of playing ATRAC3, ATRAC3plus, and MP3 CDs, and have become progressively thinner and more compact with each revision.
Sony also makes MiniDisc Walkmans. MiniDiscs come in a plastic caddy protecting the disc's surface from dust and scratches. MiniDisc Walkmans are able to play and record MiniDiscs from digital and analogue sources, such as live audio from their microphone inputs. The first unit on the market, the MZ-1, was relatively large and unpocketable, but the following model, the MZ-R2, and subsequent MD Walkmans are quite compact, with today's MiniDisc Walkmans not much larger than the discs themselves.
Gradual improvements were made to MiniDisc Walkmans through the years. The addition of MDLP (MiniDisc LongPlay) codec allowed up to 4 times the amount of music to be stored on one MiniDisc, at the sacrifice of some sound quality. NetMD followed. In 2004, Hi-MD was introduced, enabling computer files as well as CD-quality audio to be recorded on the discs for the first time. By 2005, Sony had relaxed the restrictions in its SonicStage software to allow unrestricted digital transfers to and from Hi-MD and the computer.
63. Remote Desktop Services
Remote Desktop Services, formerly known as Terminal Services, is one of the components of Microsoft Windows (both server and client versions) that allows a user to access applications and data on a remote computer over a network. Terminal Services is Microsoft's implementation of thin-client terminal server computing, where Windows applications, or even the entire desktop of the computer running Terminal Services, are made accessible to a remote client machine. This contrasts with application virtualization, in which applications, while still stored on a centralized server, are streamed to the client on demand and then executed on the client machine.
Terminal Services was first introduced in Windows NT 4.0 Terminal Server Edition. It was significantly improved for Windows 2000 and Windows Server 2003. Both the underlying protocol and the service were again overhauled for Windows Vista and Windows Server 2008.[2] Windows includes two client applications which utilize Terminal Services: the first, Remote Assistance, is available in all versions of Windows XP and its successors and allows one user to assist another user.
The second, Remote Desktop, allows a user to log in to a remote system and access the desktop, applications and data on the system, as well as control it remotely. However, this is only available in certain Windows editions: Windows NT Terminal Server, subsequent Windows server editions, Windows XP Professional, and Windows Vista Business, Enterprise and Ultimate. In the client versions of Windows, Terminal Services supports only one logged-in user at a time, whereas in the server operating systems, concurrent remote sessions are allowed.
Microsoft provides the client software Remote Desktop Connection (formerly called Terminal Services Client), available for most 32-bit versions of Windows, including Windows Mobile, and for Apple's Mac OS X, that allows a user to connect to a server running Terminal Services. On Windows, both the Terminal Services client and the Remote Desktop Protocol (RDP) use TCP port 3389 by default, which is editable in the Windows registry. The client also includes an ActiveX control to embed the functionality in other applications or even a web page. A Windows CE version of the client software is also available. Server versions of Windows also include the Remote Desktop for Administration client (a special mode of the Remote Desktop Connection client), which allows remote connection to the traditional session 0 console of the server. In Windows Vista and later this session is reserved for services, and users always log on to sessions greater than 0. The server functionality is provided by the Terminal Server component, which is able to handle Remote Assistance, Remote Desktop, and Remote Administration clients. Third-party developers have created client software for other platforms, including the open-source rdesktop client for common Unix platforms.
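Since RDP listens on TCP port 3389 by default, a basic reachability check can be sketched with a plain socket. This only verifies that something accepts connections on the port; it does not speak the RDP protocol itself, and the host name in the usage note is a placeholder:

```python
import socket

def rdp_port_open(host, port=3389, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative usage (hypothetical host name):
# print(rdp_port_open("terminal-server.example.internal"))
```

If the port has been changed in the registry, the same check applies with the customized port number.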
For an enterprise, Terminal Services allows IT departments to install applications on a central server. For example, instead of deploying database or accounting software on all desktops, the applications can simply be installed on a server and remote users can log on and use them via the Internet. This centralization makes upgrading, troubleshooting, and software management much easier. As long as employees have Remote Desktop software, they will be able to use enterprise software. Terminal Services can also integrate with Windows authentication systems to prevent unauthorized users from accessing the applications or data.
64. Remote Control
A remote control is an electronic device used for the remote operation of a machine. The term remote control can be contracted to remote or controller. It is known by many other names as well, such as clicker, flipper or the changer. Commonly, remote controls are Consumer IR devices used to issue commands from a distance to televisions or other consumer electronics such as stereo systems, DVD players and dimmers. Remote controls for these devices are usually small wireless handheld objects with an array of buttons for adjusting various settings such as television channel, track number, and volume. In fact, for the majority of modern devices with this kind of control, the remote contains all the function controls while the controlled device itself only has a handful of essential primary controls. Most of these remotes communicate with their respective devices via infrared (IR) signals and a few via radio signals. Television IR signals can be mimicked by a universal remote, which is able to emulate the functionality of most major brand television remote controls. They are usually powered by small AAA or AA size batteries.
The first remote intended to control a television was developed by Zenith Radio Corporation in 1950. The remote, officially called "Lazy Bones", was connected to the television set by a wire. To improve the cumbersome setup, a wireless remote control called "Flashmatic" was developed in 1955 which worked by shining a beam of light onto a photoelectric cell. Unfortunately, the cells did not distinguish between light from the remote and light from other sources, and the Flashmatic also required that the remote control be pointed very accurately at the receiver.
In 1956 Robert Adler developed "Zenith Space Command", a wireless remote.[3] It was mechanical and used ultrasound to change the channel and volume. When the user pushed a button on the remote control it clicked and struck a bar, hence the term "clicker". Each bar emitted a different frequency and circuits in the television detected this noise. The invention of the transistor made possible cheaper electronic remotes that contained a piezoelectric crystal that was fed by an oscillating electric current at a frequency near or above the upper threshold of human hearing, though still audible to dogs. The receiver contained a microphone attached to a circuit that was tuned to the same frequency.
Some problems with this method were that the receiver could be triggered accidentally by naturally occurring noises, and some people, especially young women, could hear the piercing ultrasonic signals. There was even a noted incident in which a toy xylophone changed the channels on these types of TVs, since some of the overtones from the xylophone matched the remote's ultrasonic frequency. Later, BBC engineers began talks with one or two television manufacturers, which led to early prototypes in around 1977-78 that could control a much larger number of functions. ITT was one of the companies and later gave its name to the ITT protocol of infrared communication.
65. Remote sensing
Remote sensing is the small or large-scale acquisition of information of an object or phenomenon, by the use of either recording or real-time sensing device(s) that are wireless, or not in physical or intimate contact with the object (such as by way of aircraft, spacecraft, satellite, buoy, or ship). In practice, remote sensing is the stand-off collection through the use of a variety of devices for gathering information on a given object or area. Thus, Earth observation or weather satellite collection platforms, ocean and atmospheric observing weather buoy platforms, the monitoring of a parolee via an ultrasound identification system, Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET), X-radiation (X-RAY) and space probes are all examples of remote sensing. In modern usage, the term generally refers to the use of imaging sensor technologies including but not limited to the use of instruments aboard aircraft and spacecraft, and is distinct from other imaging-related fields such as medical imaging.
There are two kinds of remote sensing. Passive sensors detect natural radiation that is emitted or reflected by the object or surrounding area being observed. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, Infrared, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas whereupon a sensor then detects and measures the radiation that is reflected or backscattered from the target. RADAR is an example of active remote sensing where the time delay between emission and return is measured, establishing the location, height, speeds and direction of an object.
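The ranging principle behind active sensing such as RADAR follows directly from the measured time delay: the pulse travels to the target and back, so the one-way distance is the speed of light times the delay, divided by two. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def target_range_m(echo_delay_s):
    """Range to a radar target from the round-trip echo delay:
    the pulse travels out and back, so one-way distance is c * t / 2."""
    return C * echo_delay_s / 2.0

# A 1 millisecond round-trip delay corresponds to roughly 150 km.
print(target_range_m(1e-3))  # ~149896.229 m
```

Doppler shift of the returned signal gives the target's radial speed by a similarly simple relation.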
Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, the effects of climate change on glaciers and Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the cold war made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.
Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which in conjunction with larger scale aerial or ground-based sensing and analysis provides researchers with enough information to monitor trends such as El Niño and other natural long and short term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation, and national security and overhead, ground-based and stand-off collection on border areas.
66. Remote procedure call
Remote procedure call (RPC) is an Inter-process communication technology that allows a computer program to cause a subroutine or procedure to execute in another address space (commonly on another computer on a shared network) without the programmer explicitly coding the details for this remote interaction. That is, the programmer would write essentially the same code whether the subroutine is local to the executing program, or remote. When the software in question is written using object-oriented principles, RPC may be referred to as remote invocation or remote method invocation.
Message passing
RPC is an obvious and popular paradigm for implementing the client-server model of distributed computing. An RPC is initiated by the client sending a request message to a known remote server in order to execute a specified procedure using supplied parameters. A response is returned to the client where the application continues along with its process. There are many variations and subtleties in various implementations, resulting in a variety of different (incompatible) RPC protocols. While the server is processing the call, the client is blocked (it waits until the server has finished processing before resuming execution).
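The request/response pattern described above can be sketched with Python's built-in XML-RPC modules, one of the many RPC implementations mentioned; note how the client-side call looks like an ordinary local function call while the request message, marshalling, and blocking wait happen behind the proxy:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Server side: expose an ordinary function as a remotely callable procedure.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)  # port 0: pick a free port
server.register_function(lambda a, b: a + b, "add")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the remote call reads like a local one; the client blocks
# until the server's response message arrives.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.add(2, 3)
print(result)  # 5

server.shutdown()
```

Here both ends happen to run in one process for demonstration; in practice the server would be a separate machine on the network, which is where the failure modes discussed below come into play.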
An important difference between remote procedure calls and local calls is that remote calls can fail because of unpredictable network problems. Also, callers generally must deal with such failures without knowing whether the remote procedure was actually invoked. Idempotent procedures (those which have no additional effects if called more than once) are easily handled, but enough difficulties remain that code which calls remote procedures is often confined to carefully written low-level subsystems.
Standard contact mechanisms
In order to allow servers to be accessed by differing clients, a number of standardized RPC systems have been created. Most of these use an interface description language (IDL) to allow various platforms to call the RPC. The IDL files can then be used to generate code to interface between the client and server. The most common tool used for this is RPCGEN.
67. Out-of-band management
In computing, out-of-band management (sometimes called lights-out management or LOM) involves the use of a dedicated management channel for device maintenance. It allows a system administrator to monitor and manage servers and other network equipment by remote control regardless of whether the machine is powered on. Out-of-band management addresses the limitations of in-band management by employing a management channel that is physically isolated from the data channel.
By contrast, in-band management is the use of regular data channels (usually through Ethernet) to manage devices. A significant limitation of in-band management is its vulnerability to problems from the very devices that are being managed. To manage network servers and routers remotely, IT administrators need network access when problems occur. However, the same problems that cause the network to go down also result in the loss of management access to those devices.
Beginning in the year 2000, the concept was formalized by Cyclades Corporation, an early pioneer of out-of-band infrastructure for data (later acquired by Avocent). It was quite clear that this technology was quickly becoming a core IT requirement when dealing with service levels across hundreds or thousands of geographically dispersed IT assets. OOBI, as it was coined by Cyclades, uses many of the same concepts and provides similar features to the telecom industry's out-of-band infrastructures. Vendors of OOBI solutions began offering these cost-effective alternatives to local administration for data system and network management.
In its original conception, OOBI referred to the physical architecture and components that were used to construct an out-of-band network. A more accurate description would be a service port network since the OOBI connected to the service ports rather than the data ports on the target devices. This network provided the platform to implement out-of-band management (or service port management).
Just as in the past, a data OOBI provides alternate paths into the production infrastructure for the purpose of allowing disconnected assets to be remotely reconnected and subsequently returned to normal operation, in most cases eliminating the need for costly local administration. Some OOBI implementations include inherent enterprise-class security while others are constrained to the attributes of limited or proprietary mechanisms. An OOBI can improve operational efficiencies, cut costs, improve productivity and, in many cases, improve service levels and asset availability. Conceptually, data OOB infrastructures virtually guarantee a data "dial tone."
68. Canadarm
The Shuttle Remote Manipulator System (SRMS), or Canadarm (Canadarm 1), is a mechanical arm used on the Space Shuttle to maneuver a payload from the payload bay of the orbiter to its deployment position and then release it. It can also grapple a free-flying payload, maneuver it to the payload bay of the orbiter and berth it in the orbiter. It was first used on the second Space Shuttle mission, STS-2, launched on November 13, 1981. Since the destruction of Space Shuttle Columbia during STS-107, NASA has outfitted the Canadarm with the Orbiter Boom Sensor System, a boom containing instruments to inspect the exterior of the shuttle for damage to the thermal protection system. The Canadarm is expected to play this role in all future shuttle missions.
Capabilities
The Canadarm is capable of deploying or retrieving payloads weighing up to 29 tons (65,000 pounds) in space, though the arm motors are unable to lift the arm's own weight when on the ground. The Canadarm can also retrieve, repair and deploy satellites; provide a mobile extension ladder for extravehicular activity crew members for work stations or foot restraints; and be used as an inspection aid to allow the flight crew members to view the orbiter's or payload's surfaces through a television camera on the Canadarm.
The basic Canadarm configuration consists of a manipulator arm; a Canadarm display and control panel, including rotational and translational hand controllers at the orbiter aft flight deck flight crew station; and a manipulator controller interface unit that interfaces with the orbiter computer. Most of the time, the arm operators see what they are doing by looking at the Advanced Space Vision System screen next to the controllers.
One flight crew member operates the Canadarm from the aft flight deck control station, and a second flight crew member usually assists with television camera operations. This allows the Canadarm operator to view Canadarm operations through the aft flight deck payload and overhead windows and through the closed-circuit television monitors at the aft flight deck station. The Canadarm's hydraulics, whilst able to function smoothly in orbit, are not designed for functionality on the Earth's surface. NASA therefore developed a model of the arm for use at its training facility at the Johnson Space Center in Houston, Texas.
69. Remote broadcast
In broadcast engineering, a remote broadcast (usually just called a remote or a live remote) is broadcasting done from a location away from the regular studio. A remote pickup unit (RPU) is usually used to transmit the audio and/or video back to the station, where it joins the normal air chain. Other methods include satellite trucks, and even regular telephone lines if necessary.
In radio, remotes are often used for special events, such as concerts or sporting events, where either the entire event or advertisements for the event are broadcast on location. The cost of personnel and equipment is usually paid for by the host at each performance. However, if the event is recurring, such as a weekly broadcast from a nightclub, then dedicated lines are usually installed by the local telephone company in order to save on costs.
Originally, analog audio broadcasts were sent through telephone hybrids, which, although low quality, were found to be acceptable for voice broadcasts. Later, frequency extenders were developed that used additional lines, shifting higher treble audio frequencies down on one end and back up on the other, providing a reasonable reproduction of the original sound. Currently, digital lines, such as ISDN or DSL, are used to send compressed digital audio back to the studio. In addition, modern remote pickup units have become extremely portable and can transmit single-channel monophonic FM-quality audio over regular telephone lines using built-in modems and advanced compression algorithms (MPEG-4, etc.).
In TV, live remotes are an almost daily part of news broadcasts in the U.S. As a part of electronic news gathering (ENG), remotes are meant to bring the audience to the scene of the action, although often nothing directly related to the story is happening there anymore. In some cases live remotes have even drawn ridicule, particularly for reporters who stand out in the middle of a hurricane's winds, risking their lives while telling viewers not to do the same.
70. Plasma display
A plasma display panel (PDP) is a type of flat panel display common to large TV displays (32 inches or larger). Many tiny cells between two panels of glass hold an inert mixture of noble gases. The gas in the cells is electrically turned into a plasma, which then excites phosphors to emit light. Plasma displays should not be confused with LCDs, another lightweight flatscreen display using different technology.

Plasma displays are bright (1,000 lux or higher for the module), have a wide color gamut, and can be produced in fairly large sizes, up to 381 cm (150 inches) diagonally. They have a very low-luminance "dark-room" black level compared to the lighter grey of the unilluminated parts of an LCD screen.
The display panel is only about 6 cm (2.5 inches) thick, while the total thickness, including electronics, is less than 10 cm (4 inches). Plasma displays use as much power per square meter as a CRT or an AMLCD television. Power consumption varies greatly with picture content, with bright scenes drawing significantly more power than darker ones, as is also true of CRTs. Nominal power rating is typically 400 watts for a 50-inch (127 cm) screen. Post-2006 models consume 220 to 310 watts for a 50-inch (127 cm) display when set to cinema mode. Most screens are set to 'shop' mode by default, which draws at least twice the power (around 500-700 watts) of a 'home' setting of less extreme brightness.
The lifetime of the latest generation of plasma displays is estimated at 100,000 hours of actual display time, or 27 years at 10 hours per day. This is the estimated time over which maximum picture brightness degrades to half the original value, not the time to catastrophic failure.
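The lifetime figure above is straightforward arithmetic; a quick sketch (in Python, purely illustrative) confirms the quoted 27-year estimate:

```python
# Check the quoted plasma lifetime: 100,000 hours of display time
# at 10 hours of use per day.
LIFETIME_HOURS = 100_000
HOURS_PER_DAY = 10

days = LIFETIME_HOURS / HOURS_PER_DAY   # 10,000 days of use
years = days / 365                      # convert days to years

print(f"{years:.1f} years")  # → 27.4 years, matching the ~27-year figure
```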
Plasma displays also have their drawbacks. They are often criticized for reflecting more ambient light than LCD displays: the screen is made from glass, which reflects more light than the material used to make an LCD screen, creating glare. Companies such as Panasonic coat their newer plasma screens with an anti-glare filter. Plasma panels currently cannot be made in screen sizes smaller than 32 inches. Although a few companies have been able to make plasma EDTVs this small, even fewer have made 32-inch plasma HDTVs, and the 32-inch plasma size is disappearing from the market. Plasma displays are also considered bulky and thick (usually about six inches deep) compared to their LCD counterparts, although 2009 high-end displays, such as Panasonic's Z1 and Samsung's B860 series, can be as slim as one inch thick. Plasma displays also tend to consume more electricity than LCD displays. Panasonic aims to address this with the Neo-PDP screens in its 2009 series of Viera plasma HDTVs, which Panasonic states will consume half the power of the previous series while achieving the same overall brightness.
71. Video projector
A video projector takes a video signal and projects the corresponding image on a projection screen using a lens system. All video projectors use a very bright light to project the image, and most modern ones can correct any curves, blurriness, and other inconsistencies through manual settings. Video projectors are widely used for conference room presentations, classroom training, home theatre and live events applications. Projectors are widely used in many schools and other educational settings, connected to an interactive white board to interactively teach pupils.
A video projector, also known as a Digital Projector, may be built into a cabinet with a rear-projection screen (rear-projection TV, or RPTV) to form a single unified display device, now popular for “home theater” applications. Common display resolutions for a portable projector include SVGA (800×600 pixels), XGA (1024×768 pixels), 720p (1280×720 pixels), and 1080p (1920×1080 pixels).
The cost of a device is not only determined by its resolution, but also by its light output, acoustic noise output, contrast, and other characteristics. While most modern projectors provide sufficient light for a small screen at night or under controlled lighting such as in a basement with no windows[1], a projector with a higher light output (measured in lumens, abbreviated “lm”) is required for a larger screen or a room with a higher amount of ambient light. A rating of 1500 to 2500 ANSI lumens or lower is suitable for smaller screens with controlled lighting or low ambient light.
Between 2500 and 4000 lm is suitable for medium-sized screens with some ambient light or dimmed light. Over 4000 lm is appropriate for very large screens in a large room with no lighting control (for example, a conference room). Projected image size is important; because the total amount of light does not change, as size increases, brightness decreases. Image sizes are typically measured in linear terms, diagonally, obscuring the fact that larger images require much more light (proportional to the image area, not just the length of a side). Increasing the diagonal measure of the image by 25% reduces the image brightness by 35%; an increase of 41% reduces brightness by half.
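The size/brightness trade-off above follows from brightness being inversely proportional to image area, and hence to the square of the diagonal for a fixed aspect ratio. A small sketch (Python, illustrative only) reproduces the quoted figures:

```python
# With a fixed total light output, brightness falls with image area;
# area grows with the square of the diagonal, so brightness ~ 1/d^2.
def brightness_drop(diagonal_scale: float) -> float:
    """Fraction of brightness lost when the diagonal grows by this factor."""
    return 1.0 - 1.0 / diagonal_scale ** 2

print(f"{brightness_drop(1.25):.0%}")  # 25% larger diagonal → ~36% dimmer
print(f"{brightness_drop(1.41):.0%}")  # 41% larger diagonal → ~50% dimmer (half)
```

(The text's 35% figure is the same calculation rounded slightly differently.)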
72. Movie projector
A movie projector is an opto-mechanical device for displaying moving pictures by projecting them on a projection screen. Most of the optical and mechanical elements, except for the illumination and sound devices, are present in movie cameras.
Physiology
According to the theory of persistence of vision, the perceptual processes of the brain and the retina of the human eye retain an image for a brief moment of time. This theory is said to account for the illusion of motion which results when a series of film images is displayed in quick succession, rather than the perception of the individual frames in the series.

Persistence of vision should be compared with the related phenomena of beta movement and phi movement. A critical part of understanding these visual perception phenomena is that the eye is not a camera, i.e. there is no "frame rate" or "scan rate" in the eye. Instead, the eye/brain system has a combination of motion detectors, detail detectors and pattern detectors, the outputs of all of which are combined to create the visual experience.
The frequency at which flicker becomes invisible is called the flicker fusion threshold, and is dependent on the level of illumination. Generally, a frame rate of 16 frames per second (frame/s) is regarded as the lowest frequency at which continuous motion is perceived by humans. (Interestingly, this threshold varies across different species; a higher proportion of rod cells in the retina will create a higher threshold level.)

To observe this, close your eyelids, then periodically rapidly blink open and closed. If done fast enough, you will be able to randomly "trap" the image between frames, or during shutter motion. This will not work with television, due to the persistence of the phosphors, nor with LCD or DLP light projectors, due to the continuity of the image, although certain color artifacts may appear with some digital projection technologies.
Since the birth of sound film, virtually all film projectors in commercial movie theaters project at a constant speed of 24 frame/s. This speed was chosen for financial and technical reasons: it was the slowest speed (and thus required the least film stock, making it cheapest for producers) at which satisfactory reproduction and amplification of sound could be achieved. There are some specialist formats (e.g., Showscan and Maxivision) which project at higher rates, often 48 frame/s.
73. Slide projector
A slide projector is an opto-mechanical device to view photographic slides. It has four main elements: a fan-cooled electric light bulb or other light source, a reflector and "condensing" lens to direct the light to the slide, a holder for the slide and a focusing lens. A flat piece of heat absorbing glass is often placed in the light path between the condensing lens and the slide, to avoid damaging the latter. This glass transmits visible wavelengths but absorbs infrared. Light passes through the transparent slide and lens, and the resulting image is enlarged and projected onto a perpendicular flat screen so the audience can view its reflection. Alternatively the image may be projected onto a translucent "rear projection" screen, often used for continuous automatic display for close viewing. This form of projection also avoids the audience's interrupting the light stream or bumping into the projector.
Slide projectors were common in the 1950s and 1960s as a form of entertainment; family members and friends would gather to view slideshows. In-home photographic slides and slide projectors have largely been replaced by low cost paper prints, digital cameras, DVD media, video display monitors and digital projectors.
As of October 2004, Kodak no longer manufactures slide projectors. It is also increasingly difficult in some countries to locate photo processors who will process slide film. A Large Format Slide Projector (also often called a "Large Format Projector" or "Large Image Projector") is a kind of slide projector which has a very powerful light source (up to 12,000 watts, using arc lamps). It is therefore necessary to use a large slide format to protect the slide material from overheating during projection (even when the light is filtered to only visible wavelengths and the slide is cooled with strong slide-cooling fans). Typical slide formats are 18×18 cm (7.1×7.1 in) or 24×24 cm (9.4×9.4 in).
74. Digital television
Digital television (DTV) is the sending and receiving of moving images and sound by discrete (digital) signals, in contrast to the analog signals used by analog TV. Digital television supports many different picture formats defined by the combination of size, aspect ratio (width to height ratio) and interlacing. With terrestrial broadcasting in the USA, the range of formats can be coarsely divided into two categories: HDTV and SDTV. These terms by themselves are not very precise, however, and many subtle intermediate cases exist.
High-definition television (HDTV), one of several different formats that can be transmitted over DTV, uses one of two formats: 1280 × 720 pixels in progressive scan mode (abbreviated 720p) or 1920 × 1080 pixels in interlace mode (1080i). Each of these utilizes a 16:9 aspect ratio. (Some televisions are capable of receiving an HD resolution of 1920 × 1080 at a 60 Hz progressive scan frame rate — known as 1080p60, but this standard is not currently used for transmission.) HDTV cannot be transmitted over current analog channels.
Standard definition TV (SDTV), by comparison, may use one of several different formats taking the form of various aspect ratios depending on the technology used in the country of broadcast. For 4:3 aspect-ratio broadcasts, the 640 × 480 format is used in NTSC countries, while 720 × 576 (rescaled to 768 × 576) is used in PAL countries. For 16:9 broadcasts, the 704 × 480 (rescaled to 848 × 480) format is used in NTSC countries, while 720 × 576 (rescaled to 1024 × 576) is used in PAL countries. However, broadcasters may choose to reduce these resolutions to save bandwidth (e.g., many DVB-T channels in the United Kingdom use a horizontal resolution of 544 or 704 pixels per line).[3] This is done through the use of interlacing, in which the effective vertical resolution is halved to 288 lines.
Each commercial terrestrial DTV channel in North America is permitted to broadcast at a data rate of up to 19 megabits per second, or 2.375 megabytes per second. However, the broadcaster does not need to use this entire bandwidth for just one broadcast channel. Instead, the broadcast can be subdivided across several video sub-channels (also called feeds) of varying quality and compression rates, including non-video datacasting services that allow one-way high-bandwidth streaming of data to computers.
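The bandwidth arithmetic above, and a hypothetical split of the multiplex into sub-channels, can be sketched as follows (the sub-channel names and rates are illustrative assumptions, not a real station's lineup):

```python
# ATSC terrestrial channel: up to 19 Mbit/s total.
CHANNEL_MBIT_S = 19.0

# 19 megabits/s divided by 8 bits per byte = 2.375 megabytes/s.
print(CHANNEL_MBIT_S / 8)  # → 2.375

# Hypothetical subdivision of the multiplex into sub-channels (feeds).
subchannels_mbit_s = {
    "main HD feed": 12.0,
    "SD sub-channel": 4.0,
    "weather loop": 2.0,
    "datacast service": 1.0,
}
# The feeds together must fit within the channel's total data rate.
assert sum(subchannels_mbit_s.values()) <= CHANNEL_MBIT_S
```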
75. Memory card
A memory card or flash memory card is a solid-state electronic flash memory data storage device capable of storing digital content. Memory cards are mainly used in digital cameras, handheld and mobile computers, mobile phones, music players, video game consoles, and other electronics. They offer high re-recordability, power-free storage, a small form factor, and rugged environmental specifications. There are also non-solid-state memory cards that do not use flash memory, and there are different types of flash memory.
There are many different types of memory cards, used for many different jobs, including in digital cameras, game consoles, cell phones, and industrial applications. PC Cards (PCMCIA) were among the first commercial memory card formats (Type I cards) to come out in the 1990s, but they are now mainly used in industrial applications and for I/O jobs (using Types I/II/III), as a connection standard for devices such as modems. Also in the 1990s, a number of memory card formats smaller than the PC Card came out, including CompactFlash, SmartMedia, and the Miniature Card. In other areas, tiny embedded memory cards (SID) were used in cell phones, game consoles started using proprietary memory card formats, and devices like PDAs and digital music players started using removable memory cards.
From the late 1990s into the early 2000s a host of new formats appeared, including SD/MMC, Memory Stick, xD-Picture Card, and a number of variants and smaller cards. The desire for ultra-small cards for cell phones, PDAs, and compact digital cameras drove a trend toward smaller cards that left the previous generation of "compact" cards looking big. In digital cameras, SmartMedia and CompactFlash had been very successful; in 2001 SmartMedia alone captured 50% of the digital camera market, and CompactFlash had a stranglehold on professional digital cameras.
By 2005, however, SD/MMC had nearly taken over SmartMedia's spot, though not to the same level, and with stiff competition coming from Memory Stick variants, xD, and CompactFlash. In industrial fields, even the venerable PC Card (PCMCIA) memory cards still maintain a niche, while in cell phones and PDAs the memory card market is highly fragmented.
76. Digital camera
A digital camera (or digicam for short) is a camera that takes video or still photographs, or both, digitally by recording images via an electronic image sensor. Many compact digital still cameras can record sound and moving video as well as still photographs. In the Western market, digital cameras outsell their 35 mm film counterparts.
Digital cameras can do things film cameras cannot: displaying images on a screen immediately after they are recorded, storing thousands of images on a single small memory device, recording video with sound, and deleting images to free storage space. Digital cameras are incorporated into many devices ranging from PDAs and mobile phones (called camera phones) to vehicles. The Hubble Space Telescope and other astronomical devices are essentially specialized digital cameras.
Compact cameras are designed to be small and portable and are particularly suitable for casual and "snapshot" use, and are thus also called point-and-shoot cameras. The smallest, generally less than 20 mm thick, are described as subcompacts or "ultra-compacts". Compact cameras are usually designed to be easy to use, sacrificing advanced features and picture quality for compactness and simplicity; images can usually only be stored using lossy compression (JPEG). Most have a built-in flash, usually of low power, sufficient for nearby subjects. Live preview is almost always used to frame the photo. They may have limited motion picture capability. Compacts often have macro capability, but if they have zoom capability the range is usually less than that of bridge and DSLR cameras. They have a greater depth of field, allowing objects within a large range of distances from the camera to be in sharp focus.
Bridge cameras often have super zoom lenses which provide a very wide zoom range, typically between 10:1 and 18:1, which is attained at the cost of some distortions, including barrel and pincushion distortion, to a degree which varies with lens quality. These cameras are sometimes marketed as and confused with digital SLR cameras since the appearance is similar. Bridge cameras lack the mirror and reflex system of DSLRs, have so far been fitted with fixed (non-interchangeable) lenses (although in some cases accessory wide-angle or telephoto converters can be attached to the lens), can usually take movies with sound, and the scene is composed by viewing either the liquid crystal display or the electronic viewfinder (EVF). They are usually slower to operate than a true digital SLR, but they are capable of very good image quality (with sufficient light) while being more compact and lighter than DSLRs. The high-end models of this type have comparable resolutions to low and mid-range DSLRs. Many of these cameras can store images in lossless RAW format as an option to JPEG compression. The majority have a built-in flash, often a unit which flips up over the lens. The guide number tends to be between 11 and 15.
77. Digital Audio Broadcasting
Digital Audio Broadcasting (DAB) is a digital radio technology for broadcasting radio stations, used in several countries, particularly in Europe. As of 2006, approximately 1,000 stations worldwide broadcast in the DAB format.
The DAB standard was designed in the 1980s, and receivers have been available in many countries for several years. Proponents claim the standard offers several benefits over existing analogue FM radio, such as more stations in the same broadcast spectrum, and increased resistance to noise, multipath, fading, and co-channel interference. However, listening tests carried out by experts in the field of audio have shown that the audio quality on DAB is lower than on FM in the UK on stationary receivers, due to 98% of stereo stations using a bit rate of 128 kbit/s with the MP2 audio codec, which requires double that amount to achieve perceived CD quality.
An upgraded version of the system, called DAB+, was released in February 2007. It is not backward-compatible with DAB, which means that DAB-only receivers cannot receive DAB+ broadcasts. DAB+ is approximately twice as efficient as DAB due to its adoption of the AAC+ audio codec, and it can provide high-quality audio at bit rates as low as 64 kbit/s. Reception quality is also more robust on DAB+ than on DAB due to the addition of Reed-Solomon error correction coding.
The reception quality on DAB can be poor even for people who live well within the coverage area. The reason is that the old version of DAB uses weak error correction coding, so that when there are many errors in the received data, not enough of them can be corrected and a "bubbling mud" sound occurs; in some cases a complete loss of signal can happen. This situation will improve under the newer DAB+ standard, discussed above, which uses stronger error correction coding, and as additional transmitters are built.
78. Modem
Modem (from modulator-demodulator) is a device that modulates an analog carrier signal to encode digital information, and also demodulates such a carrier signal to decode the transmitted information. The goal is to produce a signal that can be transmitted easily and decoded to reproduce the original digital data. Modems can be used over any means of transmitting analog signals, from light-emitting diodes to radio.
The most familiar example is a voice band modem that turns the digital 1s and 0s of a personal computer into sounds that can be transmitted over the telephone lines of Plain Old Telephone Systems (POTS), and once received on the other side, converts those 1s and 0s back into a form used by a USB, Ethernet, serial, or network connection. Modems are generally classified by the amount of data they can send in a given time, normally measured in bits per second, or "bps". They can also be classified by Baud, the number of times the modem changes its signal state per second.
Baud is not the modem's speed in bit/s, but in symbols/s. The baud rate varies depending on the modulation technique used. The original Bell 103 modems used a modulation technique that saw a change in state 300 times per second. They transmitted 1 bit for every baud, and so a 300 bit/s modem was also a 300-baud modem; casual computer users, however, often confused the two. A 300 bit/s modem is the only modem whose bit rate matches the baud rate. A 2400 bit/s modem changes state only 600 times per second, but because it transmits 4 bits for each baud, 2400 bits are transmitted by 600 baud, or changes in state.
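The baud/bit-rate relationship described above reduces to a single multiplication; a minimal sketch (Python, purely illustrative):

```python
# bit rate (bit/s) = baud (symbol changes per second) x bits per symbol
def bit_rate(baud: int, bits_per_symbol: int) -> int:
    return baud * bits_per_symbol

print(bit_rate(300, 1))  # Bell 103: 300 baud, 1 bit/symbol → 300 bit/s
print(bit_rate(600, 4))  # 600 baud, 4 bits/symbol → 2400 bit/s
```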
Faster modems are used by Internet users every day, notably cable modems and ADSL modems. In telecommunications, "wide band radio modems" transmit repeating frames of data at very high data rates over microwave radio links. Narrow band radio modems are used for low data rates, up to 19.2 kbit/s, mainly in private radio networks. Some microwave modems transmit more than a hundred million bits per second. Optical modems transmit data over optical fibers; most intercontinental data links now use optical modems transmitting over undersea optical fibers, routinely at data rates in excess of a billion (1×10⁹) bits per second. One kilobit per second (kbit/s or kb/s or kbps) as used in this article means 1000 bits per second, not 1024 bits per second. For example, a 56k modem can transfer data at up to 56,000 bits (7 kB) per second over the phone line.
79. Hacker (computer security)
In common usage, a hacker is a person who breaks into computers, usually by gaining access to administrative controls. The subculture that has evolved around hackers is often referred to as the computer underground. Proponents claim to be motivated by artistic and political ends, and are often unconcerned about the use of criminal means to achieve them.
Several subgroups of the computer underground with different attitudes and aims use different terms to demarcate themselves from each other, or try to exclude some specific group with which they do not agree. Eric S. Raymond advocates that members of the computer underground should be called crackers. Yet, those people see themselves as hackers and even try to include the views of Raymond in what they see as one wider hacker culture, a view harshly rejected by Raymond himself. Instead of a hacker – cracker dichotomy, they give more emphasis to a spectrum of different categories, such as white hat (ethical hacking), grey hat, black hat and script kiddie. In contrast to Raymond, they usually reserve the term cracker to refer to black hat hackers, or more generally hackers with unlawful intentions.
A black hat hacker is someone who subverts computer security without authorization or uses technology (usually a computer or the Internet) for vandalism (malicious destruction), credit card fraud, identity theft, intellectual property theft, or other types of crime. A hacktivist is a hacker who utilizes technology to announce a social, ideological, religious, or political message. In general, most hacktivism involves website defacement or denial-of-service attacks. In more extreme cases, hacktivism is used as a tool for cyberterrorism.
The computer underground is heavily dependent on technology. It has produced its own slang and various forms of unusual alphabet use, for example 1337speak. Writing programs and performing other activities to support these views is referred to as hacktivism. Some go as far as seeing illegal cracking as ethically justified for this goal; the most common form is website defacement.

The computer underground is frequently compared to the Wild West: a male-dominated frontier to conquer. It is common among hackers to use aliases to conceal their identities rather than reveal their real names.
80. Antivirus software
Antivirus (or anti-virus) software is used to prevent, detect, and remove malware, including computer viruses, worms, and trojan horses. Such programs may also prevent and remove adware, spyware, and other forms of malware.
A variety of strategies are typically employed. Signature-based detection involves searching for known malicious patterns in executable code. However, it is possible for a user to be infected with new malware for which no signature yet exists. To counter such so-called zero-day threats, heuristics can be used. One type of heuristic approach, generic signatures, can identify new viruses or variants of existing viruses by looking for known malicious code (or slight variations of such code) in files. Some antivirus software can also predict what a file will do if opened or run by emulating it in a sandbox and analyzing what it does; if it performs any malicious actions, the file may be flagged as malicious.
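Signature-based detection, at its simplest, is a search for known byte patterns in a file. The sketch below (Python) is a deliberately naive illustration; the signature bytes and threat names are invented placeholders, and real scanners use far more sophisticated matching:

```python
# Naive signature scanner: report any known malicious byte pattern
# found in the data. All signatures here are invented placeholders.
SIGNATURES = {
    b"\xde\xad\xbe\xef": "Example.TestVirus.A",
    b"FAKE-MALWARE-PATTERN": "Example.TestWorm.B",
}

def scan(data: bytes) -> list[str]:
    """Return the names of all signatures whose bytes occur in the data."""
    return [name for sig, name in SIGNATURES.items() if sig in data]

print(scan(b"header \xde\xad\xbe\xef payload"))  # → ['Example.TestVirus.A']
print(scan(b"a perfectly clean file"))           # → []
```

A generic signature works the same way but tolerates slight variations of the pattern (for example, with wildcard bytes), which is how it can catch new variants of an existing virus.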
However useful antivirus software is, it can have drawbacks. Antivirus software can degrade computer performance if it is not designed efficiently. Inexperienced users may have trouble understanding the prompts and decisions that antivirus software presents them with, and an incorrect decision may lead to a security breach. If the antivirus software employs heuristic detection (of any kind), its success depends on achieving the right balance between false positives and false negatives. False positives can be as destructive as false negatives: in one case, a faulty virus signature issued by Symantec mistakenly removed essential operating system files, leaving thousands of PCs unable to boot. Finally, antivirus software generally runs at the highly trusted kernel level of the operating system, creating a potential avenue of attack.
An emerging technique for dealing with malware in general is whitelisting. Rather than looking only for known bad software, this technique prevents the execution of all computer code except that which has been previously identified as trustworthy by the system administrator. By following this "default deny" approach, the limitations inherent in keeping virus signatures up to date are avoided. Additionally, computer applications that are unwanted by the system administrator are prevented from executing, since they are not on the whitelist. Since organizations often have large quantities of trusted applications, the limitations of adopting this technique rest with the system administrators' ability to properly inventory and maintain the whitelist of trusted applications. Viable implementations of this technique include tools for automating the inventory and whitelist maintenance processes.
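The "default deny" approach can be illustrated with a hash-based allowlist, one common implementation strategy (the text above does not prescribe a particular one). The binary contents below are illustrative assumptions:

```python
import hashlib

def fingerprint(binary: bytes) -> str:
    """SHA-256 hash used to identify a program."""
    return hashlib.sha256(binary).hexdigest()

# The administrator inventories trusted programs ahead of time.
trusted_tool = b"#!/bin/sh\necho approved maintenance tool\n"
WHITELIST = {fingerprint(trusted_tool)}

def may_execute(binary: bytes) -> bool:
    """Default deny: run only programs whose hash is on the whitelist."""
    return fingerprint(binary) in WHITELIST

print(may_execute(trusted_tool))        # → True
print(may_execute(b"unknown program"))  # → False
```

Note that the whitelist must be updated whenever a trusted program changes, which is exactly the inventory-maintenance burden the text describes.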
81. Malware
Many early infectious programs, including the first Internet Worm and a number of MS-DOS viruses, were written as experiments or pranks, generally intended to be harmless or merely annoying rather than to cause serious damage to computers. In some cases the perpetrators did not realize how much harm their creations could do. Young programmers learning about viruses and their techniques wrote them simply to prove that they could, or to see how far a virus could spread. As late as 1999, widespread viruses such as the Melissa virus appear to have been written chiefly as pranks.
Hostile intent related to vandalism can be found in programs designed to cause harm or data loss. Many DOS viruses, and the Windows ExploreZip worm, were designed to destroy files on a hard disk, or to corrupt the file system by writing invalid data. Network-borne worms such as the 2001 Code Red worm or the Ramen worm fall into the same category. Designed to vandalize web pages, these worms may seem like the online equivalent to graffiti tagging, with the author's alias or affinity group appearing everywhere the worm goes.
However, since the rise of widespread broadband Internet access, malicious software has come to be designed for a profit motive, either more or less legal (forced advertising) or criminal. For instance, since 2003, the majority of widespread viruses and worms have been designed to take control of users' computers for black-market exploitation. Infected "zombie computers" are used to send email spam, to host contraband data such as child pornography, or to engage in distributed denial-of-service attacks as a form of extortion.
Another strictly for-profit category of malware has emerged in spyware -- programs designed to monitor users' web browsing, display unsolicited advertisements, or redirect affiliate marketing revenues to the spyware creator. Spyware programs do not spread like viruses; they are generally installed by exploiting security holes or are packaged with user-installed software, such as peer-to-peer applications.
82. Computer worm
A computer worm is a self-replicating computer program. It uses a network to send copies of itself to other nodes (computers on the network), and it may do so without any user intervention. Unlike a virus, it does not need to attach itself to an existing program. Worms almost always cause at least some harm to the network, if only by consuming bandwidth, whereas viruses almost always corrupt or modify files on a targeted computer.
Many worms that have been created are only designed to spread, and don't attempt to alter the systems they pass through. However, as the Morris worm and Mydoom showed, the network traffic and other unintended effects can often cause major disruption. A "payload" is code designed to do more than spread the worm - it might delete files on a host system (e.g., the ExploreZip worm), encrypt files in a cryptoviral extortion attack, or send documents via e-mail. A very common payload for worms is to install a backdoor in the infected computer to allow the creation of a "zombie" computer under control of the worm author - Sobig and Mydoom are examples which created zombies. Networks of such machines are often referred to as botnets and are very commonly used by spam senders for sending junk email or to cloak their website's address. Spammers are therefore thought to be a source of funding for the creation of such worms, and the worm writers have been caught selling lists of IP addresses of infected machines. Others try to blackmail companies with threatened DoS attacks.
Beginning with the very first research into worms at Xerox PARC there have been attempts to create useful worms. The Nachi family of worms, for example, tried to download and install patches from Microsoft's website to fix vulnerabilities in the host system – by exploiting those same vulnerabilities. In practice, although this may have made these systems more secure, it generated considerable network traffic, rebooted the machine in the course of patching it, and did its work without the consent of the computer's owner or user.
Some worms, such as XSS worms, have been written for research, to determine the factors affecting how worms spread, such as social activity and changes in user behavior, while other worms are little more than a prank, such as one that sends the popular image macro of an owl with the phrase "O RLY?" to a print queue on the infected computer.
Worms spread by exploiting vulnerabilities in operating systems. All vendors supply regular security updates, and if these are installed on a machine then the majority of worms are unable to spread to it. If a vendor acknowledges a vulnerability but has yet to release a security update to patch it, a zero-day exploit is possible. However, these are relatively rare.
83. Zombie computer
A zombie computer (often shortened as zombie) is a computer attached to the Internet that has been compromised by a hacker, a computer virus, or a trojan horse. Generally, a compromised machine is only one of many in a botnet, and will be used to perform malicious tasks of one sort or another under remote direction. Most owners of zombie computers are unaware that their system is being used in this way. Because the owner tends to be unaware, these computers are metaphorically compared to zombies.
Zombies have been used extensively to send e-mail spam; as of 2005, an estimated 50–80% of all spam worldwide was sent by zombie computers. This allows spammers to avoid detection and presumably reduces their bandwidth costs, since the owners of zombies pay for their own bandwidth. This spam also greatly furthers the spread of Trojan horses; because Trojans, like viruses, are not self-replicating (unlike worms), they rely on the movement of e-mail and spam to spread.
Zombies can be used to conduct distributed denial-of-service attacks, a term which refers to the orchestrated flooding of target websites by armies of zombie computers. The large number of Internet users making simultaneous requests of a website's server is intended to crash it and prevent legitimate users from accessing the site. A variant of this type of flooding is known as distributed degradation-of-service. Committed by "pulsing" zombies, distributed degradation-of-service is the moderated and periodic flooding of websites, done with the intent of slowing down rather than crashing a victim site. The effectiveness of this tactic springs from the fact that intense flooding can be quickly detected and remedied, but pulsing-zombie attacks and the resulting slow-down in website access can go unnoticed for months and even years.
Notable incidents of distributed denial- and degradation-of-service attacks in the past include the attack upon the SPEWS service in 2003, and the one against the Blue Frog service in 2006. In 2000, several prominent Web sites (Yahoo, eBay, etc.) were clogged to a standstill by a distributed denial-of-service attack mounted by a Canadian teenager. An attack on grc.com is discussed at length on the Gibson Research Web site, where the perpetrator, a 13-year-old probably from Kenosha, Wisconsin, was identified. Steve Gibson disassembled a 'bot' which was a zombie used in the attack, and traced it to its distributor. In his account of his research, he describes the operation of a 'bot'-controlling IRC channel.
84. Trojan Horse (computing)
A Trojan horse, or trojan for short, is malware that appears to the user to perform a desirable function but in fact facilitates unauthorized access to the user's computer system. The term comes from the Trojan horse story in Greek mythology. Trojan horses are not self-replicating, which distinguishes them from viruses and worms. Additionally, they require interaction with a hacker to fulfill their purpose. The hacker need not be the individual responsible for distributing the Trojan horse. It is possible for hackers to scan computers on a network using a port scanner in the hope of finding one with a Trojan horse installed.
Trojan horses are designed to allow a hacker remote access to a target computer system. Once a Trojan horse has been installed on a target computer system it is possible for a hacker to access it remotely and perform operations. The type of operations that a hacker can perform is limited by user privileges on the target computer system and the design of the Trojan horse itself.
Operations which could be performed by a hacker on a target computer system include:
Deletion of files
Modification of files
Uploading of files
Downloading of files
Installation of software (including other malware)
Data Theft (e.g. passwords, security codes, credit card information)
Use of the machine as part of a Botnet (e.g. to perform Distributed Denial-of-service (DDoS) attacks)
Keystroke logging
Viewing the user's screen
One notable Trojan horse was distributed using email. Reports suggest that it was widely distributed and that there were several versions. The email sent to distribute the Trojan horse purported to be from Microsoft Corporation and to offer a free upgrade for Microsoft Internet Explorer. The email did not originate from Microsoft Corporation, nor did it provide an upgrade for Microsoft Internet Explorer. The Trojan horse was an executable file named "ie0199.exe" and was provided as an email attachment. One version of the email included the message:
"As a user of the Microsoft Internet Explorer, Microsoft Corporation provides you with this upgrade for your web browser. It will fix some bugs found in your Internet Explorer. To install the upgrade, please save the attached file (ie0199.exe) in some folder and run it."
Once installed, the Trojan horse reportedly modified system files and attempted to initiate contact with other remote systems.
85. Computer virus
A computer virus is a computer program that can copy itself and infect a computer without the permission or knowledge of the owner. The term "virus" is also commonly but erroneously used to refer to other types of malware, adware, and spyware programs that do not have the reproductive ability. A true virus can only spread from one computer to another (in some form of executable code) when its host is taken to the target computer; for instance, because a user sent it over a network or the Internet, or carried it on a removable medium such as a floppy disk, CD, DVD, or USB drive. Viruses can increase their chances of spreading to other computers by infecting files on a network file system or a file system that is accessed by another computer.
The term "computer virus" is sometimes used as a catch-all phrase to include all types of malware. Malware includes computer viruses, worms, trojan horses, most rootkits, spyware, dishonest adware, crimeware, and other malicious and unwanted software, including true viruses. Viruses are sometimes confused with computer worms and Trojan horses, which are technically different.
A worm can exploit security vulnerabilities to spread itself to other computers without needing to be transferred as part of a host, and a Trojan horse is a program that appears harmless but has a hidden agenda. Worms and Trojans, like viruses, may cause harm to a computer system's hosted data, functional performance, or networking throughput, when they are executed. Some viruses and other malware have symptoms noticeable to the computer user, but many are surreptitious.
In order to replicate itself, a virus must be permitted to execute code and write to memory. For this reason, many viruses attach themselves to executable files that may be part of legitimate programs. If a user attempts to launch an infected program, the virus's code may be executed at the same time. Viruses can be divided into two types based on their behavior when they are executed. Nonresident viruses immediately search for other hosts that can be infected, infect those targets, and finally transfer control to the application program they infected. Resident viruses do not search for hosts when they are started. Instead, a resident virus loads itself into memory on execution and transfers control to the host program. The virus stays active in the background and infects new hosts when those files are accessed by other programs or the operating system itself.
Resident viruses contain a replication module that is similar to the one employed by nonresident viruses. This module, however, is not called by a finder module. Instead, the virus loads the replication module into memory when it is executed, and ensures that this module is executed each time the operating system is called to perform a certain operation. The replication module can be called, for example, each time the operating system executes a file. In this case the virus infects every suitable program that is executed on the computer.
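The control-flow difference between the two types can be illustrated with a harmless toy model in Python, where the "file system" is just a dictionary and "infection" appends a marker string; all names here are invented for illustration and nothing touches real files:

```python
def infect(fs, name):
    """Toy stand-in for code injection: mark an '.exe' entry as infected."""
    if name.endswith(".exe") and not fs[name].endswith("[infected]"):
        fs[name] += "[infected]"

def run_nonresident(fs, host):
    """Nonresident virus: the finder module locates all targets and the
    replication module infects them at once, then control returns to the host."""
    for name in fs:
        infect(fs, name)
    return fs[host]

def run_resident(fs, host, hooks):
    """Resident virus: only installs a hook into the 'operating system';
    infection happens later, whenever a file is executed through the hook."""
    hooks.append(infect)
    return fs[host]

def os_execute(fs, name, hooks):
    """Toy OS entry point: resident replication modules fire on each access."""
    for hook in hooks:
        hook(fs, name)
    return fs[name]
```

Running a nonresident host infects every suitable program immediately, while a resident host infects nothing until the "OS" later executes each file.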
86. Credit card
A credit card is part of a system of payments named after the small plastic card issued to users of the system. It is a card entitling its holder to buy goods and services based on the holder's promise to pay for these goods and services. The issuer of the card grants a line of credit to the consumer (or the user) from which the user can borrow money for payment to a merchant or as a cash advance to the user.
A credit card is different from a charge card, in that a charge card requires the balance to be paid in full each month. In contrast, credit cards allow consumers to 'revolve' their balance, at the cost of having interest charged. Most credit cards are issued by local banks or credit unions, and are the shape and size specified by the ISO/IEC 7810 standard as ID-1.
When a purchase is made, the credit card user agrees to pay the card issuer. The cardholder indicates consent to pay by signing a receipt with a record of the card details and indicating the amount to be paid or by entering a personal identification number (PIN). Also, many merchants now accept verbal authorizations via telephone and electronic authorization using the Internet, known as a 'Card/Cardholder Not Present' (CNP) transaction.
Electronic verification systems allow merchants to verify that the card is valid and the credit card customer has sufficient credit to cover the purchase in a few seconds, allowing the verification to happen at time of purchase. The verification is performed using a credit card payment terminal or Point of Sale (POS) system with a communications link to the merchant's acquiring bank. Data from the card is obtained from a magnetic stripe or chip on the card; the latter system is commonly known in the United Kingdom and Ireland as Chip and PIN, but is more technically an EMV card.
87. Debit card
A debit card (also known as a bank card or check card) is a plastic card which provides an alternative payment method to cash when making purchases. Functionally, it can be called an electronic check, as the funds are withdrawn directly from either the bank account, or from the remaining balance on the card. In some cases, the cards are designed exclusively for use on the Internet, and so there is no physical card.
The use of debit cards has become widespread in many countries and has overtaken the check and, in some instances, cash transactions by volume. Like credit cards, debit cards are used widely for telephone and Internet purchases, but unlike credit cards, the funds are transferred from the bearer's bank account instead of the bearer paying back at a later date.
Debit cards can also allow for instant withdrawal of cash, acting as the ATM card for withdrawing cash and as a cheque guarantee card. Merchants can also offer "cashback"/"cashout" facilities to customers, where a customer can withdraw cash along with their purchase.
For consumers, the difference between a "debit card" and a "credit card" is that the debit card deducts the balance from a deposit account, like a checking account, whereas the credit card allows the consumer to spend money on credit extended by the issuing bank. In other words, a debit card uses the money you have, and a credit card uses money you don't have. "Debit cards" which are linked directly to a checking account are sometimes dual-purpose, so that they can be used as a credit card and can be charged by merchants using the traditional credit networks. A merchant will ask "credit or debit?" if the card is a combined credit and debit card. If the cardholder chooses "credit", the credit balance will be debited the amount of the purchase; if the cardholder chooses "debit", the bank account balance will be debited the amount of the purchase.
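The distinction can be sketched as two different bookkeeping rules; this is a toy model with invented names, not a description of any real issuer's system:

```python
class Account:
    """Toy model of one cardholder's funds at a bank."""
    def __init__(self, deposit=0.0, credit_limit=0.0):
        self.deposit = deposit          # money the holder has (checking account)
        self.owed = 0.0                 # money the holder has borrowed
        self.credit_limit = credit_limit

    def pay_debit(self, amount):
        """Debit: deduct immediately from the deposit balance."""
        if amount > self.deposit:
            raise ValueError("insufficient funds")
        self.deposit -= amount

    def pay_credit(self, amount):
        """Credit: borrow from the issuer, up to the line of credit."""
        if self.owed + amount > self.credit_limit:
            raise ValueError("over credit limit")
        self.owed += amount
```

A debit purchase shrinks `deposit` at once; a credit purchase leaves `deposit` untouched and grows `owed`, to be repaid (with possible interest) later.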
88. Computer-aided design
Computer-aided design (CAD) is the use of computer technology for the design of objects, real or virtual. The design of geometric models for object shapes, in particular, is often called computer-aided geometric design (CAGD).
However, CAD often involves more than just shapes. As in the manual drafting of technical and engineering drawings, the output of CAD must often also convey symbolic information such as materials, processes, dimensions, and tolerances, according to application-specific conventions. CAD may be used to design curves and figures in two-dimensional ("2D") space, or curves, surfaces, and solids in three-dimensional ("3D") space.
CAD is an important industrial art extensively used in many applications, including the automotive, shipbuilding, and aerospace industries, industrial and architectural design, prosthetics, and many more. CAD is also widely used to produce computer animation for special effects in movies, advertising, and technical manuals. The modern ubiquity and power of computers means that even perfume bottles and shampoo dispensers are designed using techniques unheard of by shipbuilders of the 1960s. Because of its enormous economic importance, CAD has been a major driving force for research in computational geometry, computer graphics (both hardware and software), and discrete differential geometry.
Current Computer-Aided Design software packages range from 2D vector-based drafting systems to 3D solid and surface modellers. Modern CAD packages can also frequently allow rotations in three dimensions, allowing viewing of a designed object from any desired angle, even from the inside looking out. Some CAD software is capable of dynamic mathematic modeling, in which case it may be marketed as CADD — computer-aided design and drafting.
CAD is used in the design of tools and machinery and in the drafting and design of all types of buildings, from small residential types (houses) to the largest commercial and industrial structures (hospitals and factories). CAD is mainly used for detailed engineering of 3D models and/or 2D drawings of physical components, but it is also used throughout the engineering process, from conceptual design and layout of products, through strength and dynamic analysis of assemblies, to definition of manufacturing methods of components.
89. Satellite television
Satellite television is television delivered by means of a communications satellite and received by a satellite dish and set-top box. In many areas of the world it provides a wide range of channels and services, often to areas that are not served by terrestrial or cable providers. Satellites used for television signals are generally in either a highly elliptical orbit (with an inclination of ±63.4 degrees and an orbital period of about twelve hours, known as a Molniya orbit) or a geostationary orbit 37,000 km (22,300 miles) above the Earth's equator.
Satellite television, like other communications relayed by satellite, starts with a transmitting antenna located at an uplink facility. Uplink satellite dishes are very large, as much as 9 to 12 meters (30 to 40 feet) in diameter. The increased diameter results in more accurate aiming and increased signal strength at the satellite. The uplink dish is pointed toward a specific satellite and the uplinked signals are transmitted within a specific frequency range, so as to be received by one of the transponders tuned to that frequency range aboard that satellite. The transponder 'retransmits' the signals back to Earth but at a different frequency band (a process known as translation, used to avoid interference with the uplink signal), typically in the C-band (4–8 GHz) or Ku-band (12–18 GHz) or both. The leg of the signal path from the satellite to the receiving Earth station is called the downlink.
A typical satellite has up to 32 transponders for Ku-band and up to 24 for a C-band only satellite, or more for hybrid satellites. Typical transponders each have a bandwidth between 27 MHz and 50 MHz. Each geo-stationary C-band satellite needs to be spaced 2 degrees from the next satellite (to avoid interference). For Ku the spacing can be 1 degree. This means that there is an upper limit of 360/2 = 180 geostationary C-band satellites and 360/1 = 360 geostationary Ku-band satellites. C-band transmission is susceptible to terrestrial interference while Ku-band transmission is affected by rain (as water is an excellent absorber of microwaves at this particular frequency).
The downlinked satellite signal, quite weak after traveling the great distance (see inverse-square law), is collected by a parabolic receiving dish, which reflects the weak signal to the dish's focal point. Mounted on brackets at the dish's focal point is a device called a feedhorn. This feedhorn is essentially the flared front-end of a section of waveguide that gathers the signals at or near the focal point and 'conducts' them to a probe or pickup connected to a low-noise block downconverter or LNB. The LNB amplifies the relatively weak signals, filters the block of frequencies in which the satellite TV signals are transmitted, and converts the block of frequencies to a lower frequency range in the L-band range. The evolution of LNBs was one of necessity and invention. The original C-band satellite TV systems used a Low Noise Amplifier connected to the feedhorn at the focal point of the dish. The amplified signal was then fed via very expensive 50 Ohm impedance coaxial cable to an indoor receiver, or in other designs fed to a downconverter (a mixer and a voltage-tuned oscillator with some filter circuitry) for downconversion to an intermediate frequency. The channel selection was controlled, typically by a voltage-tuned oscillator with the tuning voltage being fed via a separate cable to the head end. But this simple design evolved.
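The block downconversion an LNB performs is simple arithmetic; the figures below use a typical universal-LNB local oscillator value (9.75 GHz for the low Ku band), chosen purely for illustration:

```python
# Typical low-band universal-LNB local-oscillator frequency (illustrative).
lo_ghz = 9.75

# Edges of a Ku-band downlink block received from the satellite.
downlink_ghz = [10.70, 11.70]

# The LNB subtracts the local-oscillator frequency, shifting the whole
# block down into the L-band range that ordinary coaxial cable can carry.
if_mhz = [(f - lo_ghz) * 1000 for f in downlink_ghz]  # roughly 950-1950 MHz
```

The receiver then tunes within this intermediate-frequency block rather than at the original microwave frequencies.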
90. Java (programming language)
Java is a programming language originally developed by James Gosling at Sun Microsystems and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode (class files) that can run on any Java virtual machine (JVM) regardless of computer architecture.
The original and reference implementation Java compilers, virtual machines, and class libraries were developed by Sun from 1995. As of May 2007, in compliance with the specifications of the Java Community Process, Sun had made available most of their Java technologies as free software under the GNU General Public License. Others have also developed alternative implementations of these Sun technologies, such as the GNU Compiler for Java and GNU Classpath.
One characteristic of Java is portability, which means that computer programs written in the Java language must run similarly on any supported hardware/operating-system platform. One should be able to write a program once, compile it once, and run it anywhere.
This is achieved by compiling the Java language code, not to machine code but to Java byte code – instructions analogous to machine code but intended to be interpreted by a virtual machine (VM) written specifically for the host hardware. End-users commonly use a Java Runtime Environment (JRE) installed on their own machine for standalone Java applications, or in a Web browser for Java applets.
Standardized libraries provide a generic way to access host-specific features such as graphics, threading, and networking. In some JVM versions, bytecode can be compiled to native code, either before or during program execution, resulting in faster execution. A major benefit of using bytecode is portability. However, the overhead of interpretation means that interpreted programs almost always run more slowly than programs compiled to native executables would, and Java suffered a reputation for poor performance. This gap has been narrowed by a number of optimization techniques introduced in the more recent JVM implementations.
91. Java Platform, Enterprise Edition
Java Platform, Enterprise Edition or Java EE is a widely used platform for server programming in the Java programming language. The Java platform (Enterprise Edition) differs from the Java Standard Edition Platform (Java SE) in that it adds libraries which provide functionality to deploy fault-tolerant, distributed, multi-tier Java software, based largely on modular components running on an application server.
The platform was known as Java 2 Platform, Enterprise Edition or J2EE until the name was changed to Java EE in version 5. The current version is called Java EE 5. The previous version is called J2EE 1.4.
Java EE is defined by its specification. As with other Java Community Process specifications, Java EE is also considered informally to be a standard since providers must agree to certain conformance requirements in order to declare their products as Java EE compliant; albeit with no ISO or ECMA standard.
Java EE includes several API specifications, such as JDBC, RMI, e-mail, JMS, web services, XML, etc., and defines how to coordinate them. Java EE also features some specifications unique to Java EE for components. These include Enterprise JavaBeans, servlets, portlets (following the Java Portlet specification), JavaServer Pages, and several web service technologies. This allows developers to create enterprise applications that are portable and scalable, and that integrate with legacy technologies. A Java EE application server can handle transactions, security, scalability, concurrency, and management of the components that are deployed to it, in order to enable developers to concentrate more on the business logic of the components rather than on infrastructure and integration tasks.
92. C (programming language)
C is a general-purpose computer programming language developed in 1972 by Dennis Ritchie at the Bell Telephone Laboratories for use with the Unix operating system.
Although C was designed for implementing system software, it is also widely used for developing portable application software, and is one of the most popular programming languages. It is widely used on many different software platforms, and there are few computer architectures for which a C compiler does not exist. C has greatly influenced many other popular programming languages, most notably C++, which originally began as an extension to C.
C is designed to provide high-level abstractions for the native features of a general-purpose CPU, while at the same time allowing modularization, structure, and code re-use. Features specific to a particular program's function (features that are not general to all platforms) are not included in the language or library definitions. However, any such specific functions are implementable and accessible as external reusable libraries, in order to encourage module dissemination and re-use. C is somewhat strongly typed (emitting warnings or errors) but allows programmers to override types in the interests of flexibility, simplicity, or performance, while being natural and well-defined in its interpretation of type overrides.
C's design is tied to its intended use as a portable systems implementation language. Consequently, it does not require run-time checks for conditions that would never occur in correct programs, it provides simple, direct access to any addressable object (for example, memory-mapped device control registers), and its source-code expressions can be translated in a straightforward manner to primitive machine operations in the executable code. Some early C compilers were readily implemented (as a few distinct passes communicating via intermediate files) on PDP-11 processors having only 16 address bits.
C is an imperative (procedural) systems implementation language. It was designed to be compiled using a relatively straightforward compiler, to provide low-level access to memory, to provide language constructs that map efficiently to machine instructions, and to require minimal run-time support. C was therefore useful for many applications that had formerly been coded in assembly language. Despite its low-level capabilities, the language was designed to encourage machine-independent programming. A standards-compliant and portably written C program can be compiled for a very wide variety of computer platforms and operating systems with little or no change to its source code. The language has become available on a very wide range of platforms, from embedded microcontrollers to supercomputers.
93. Visual programming language
A visual programming language (VPL) is any programming language that lets users create programs by manipulating program elements graphically rather than by specifying them textually. A VPL allows programming with visual expressions, spatial arrangements of text and graphic symbols used either as elements of syntax or secondary notation. Many VPLs are based on the idea of "boxes and arrows," where boxes or other screen objects are treated as entities, connected by arrows, lines or arcs which represent relations.
VPLs may be further classified, according to the type and extent of visual expression used, into icon-based languages, form-based languages, and diagram languages. Visual programming environments provide graphical or iconic elements which can be manipulated by users in an interactive way according to some specific spatial grammar for program construction.
A visually transformed language is a non-visual language with a superimposed visual representation. Naturally visual languages have an inherent visual expression for which there is no obvious textual equivalent.
Current developments try to integrate the visual programming approach with dataflow programming languages to either have immediate access to the program state resulting in online debugging or automatic program generation and documentation (i.e. visual paradigm). Dataflow languages also allow automatic parallelization, which is likely to become one of the greatest programming challenges of the future.
94. Computer architecture
Computer architecture in computer engineering is the conceptual design and fundamental operational structure of a computer system. It is a blueprint and functional description of requirements and design implementations for the various parts of a computer, focusing largely on the way by which the central processing unit (CPU) performs internally and accesses addresses in memory.
The term “architecture” in computer literature can be traced to the work of Lyle R. Johnson and Frederick P. Brooks, Jr., members in 1959 of the Machine Organization department in IBM’s main research center. Johnson had occasion to write a proprietary research communication about Stretch, an IBM-developed supercomputer for Los Alamos Scientific Laboratory; in attempting to characterize his chosen level of detail for discussing the luxuriously embellished computer, he noted that his description of formats, instruction types, hardware parameters, and speed enhancements aimed at the level of “system architecture” – a term that seemed more useful than “machine organization.” Subsequently Brooks, one of the Stretch designers, started Chapter 2 of a book by writing, “Computer architecture, like other architecture, is the art of determining the needs of the user of a structure and then designing to meet those needs as effectively as possible within economic and technological constraints.” Brooks went on to play a major role in the development of the IBM System/360 line of computers, where “architecture” gained currency as a noun with the definition “what the user needs to know.” Later the computer world would employ the term in many less-explicit ways.
The first mention of the term architecture in the refereed computer literature is in a 1964 article describing the IBM System/360. The article defines architecture as the set of "attributes of a system as seen by the programmer, i.e., the conceptual structure and functional behavior, as distinct from the organization of the data flow and controls, the logical design, and the physical implementation." In this definition, the programmer's perspective of the computer's functional behavior is key. The conceptual-structure part of an architecture description makes the functional behavior comprehensible and extrapolatable to a range of use cases. Only later on did 'internals' such as "the way by which the CPU performs internally and accesses addresses in memory," mentioned above, slip into the definition of computer architecture.
There are two main types of speed, latency and throughput. Latency is the time between the start of a process and its completion. Throughput is the amount of work done per unit time. Interrupt latency is the guaranteed maximum response time of the system to an electronic event (e.g. when the disk drive finishes moving some data). Performance is affected by a very wide range of design choices — for example, pipelining a processor usually makes latency worse (slower) but makes throughput better. Computers that control machinery usually need low interrupt latencies. These computers operate in a real-time environment and fail if an operation is not completed in a specified amount of time. For example, computer-controlled anti-lock brakes must begin braking almost immediately after they have been instructed to brake.
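The pipelining trade-off described above can be sketched with simple arithmetic; the stage delays below are made-up illustrative numbers:

```python
stage_times = [2, 3, 1]  # hypothetical per-stage delays, in ms

# Without pipelining, each item's latency is the sum of all stages,
# and only one item is in flight at a time.
latency_unpipelined = sum(stage_times)            # 6 ms per item
throughput_unpipelined = 1 / latency_unpipelined  # items completed per ms

# A synchronous pipeline clocks every stage at the slowest stage's speed,
# so per-item latency gets worse, but one item completes per clock.
clock = max(stage_times)                          # 3 ms
latency_pipelined = clock * len(stage_times)      # 9 ms per item
throughput_pipelined = 1 / clock                  # one item per 3 ms

assert latency_pipelined > latency_unpipelined        # latency: worse
assert throughput_pipelined > throughput_unpipelined  # throughput: better
```

The assertions at the end restate the text's claim: the pipelined design completes an item every 3 ms instead of every 6 ms, even though each individual item now takes 9 ms to get through.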
95. Data structure
In computer science, a data structure is a particular way of storing and organizing data in a computer so that it can be used efficiently. Different kinds of data structures are suited to different kinds of applications, and some are highly specialized to certain tasks. For example, B-trees are particularly well-suited for implementation of databases, while compiler implementations usually use hash tables to look up identifiers.
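The identifier-lookup use case can be sketched with Python's built-in dict, which is itself a hash table; the identifiers and attributes here are invented for illustration:

```python
# A toy compiler symbol table: identifier -> (type, scope).
symbol_table = {}

def declare(name, type_, scope):
    symbol_table[name] = (type_, scope)

def lookup(name):
    # Average O(1) lookup thanks to hashing, regardless of table size.
    return symbol_table.get(name)

declare("count", "int", "main")
declare("total", "float", "main")

assert lookup("count") == ("int", "main")
assert lookup("missing") is None  # undeclared identifiers are not found
```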
Data structures are used in almost every program or software system. Specific data structures are essential ingredients of many efficient algorithms, and make possible the management of huge amounts of data, such as large databases and internet indexing services. Some formal design methods and programming languages emphasize data structures, rather than algorithms, as the key organizing factor in software design.
Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address — a bit string that can be itself stored in memory and manipulated by the program. Thus the record and array data structures are based on computing the addresses of data items with arithmetic operations; while the linked data structures are based on storing addresses of data items within the structure itself. Many data structures use both principles, sometimes combined in non-trivial ways (as in XOR linking).
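The two addressing principles can be sketched over a simulated flat memory, here a plain Python list whose indices play the role of addresses (the layout and values are invented):

```python
memory = [None] * 16  # simulated flat memory; indices act as addresses

# Array principle: the address of element i is COMPUTED arithmetically.
base, elem_size = 0, 1
for i, v in enumerate([10, 20, 30]):
    memory[base + i * elem_size] = v

def array_get(i):
    return memory[base + i * elem_size]

# Linked principle: each node STORES the address of the next node.
memory[8] = (10, 9)       # node at address 8, next node at address 9
memory[9] = (20, 10)
memory[10] = (30, None)   # end of the list

def linked_to_list(addr):
    out = []
    while addr is not None:
        value, addr = memory[addr]
        out.append(value)
    return out
```

Both structures hold the same values, but the array reaches element i in one arithmetic step, while the linked list must follow stored addresses from the head.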
The implementation of a data structure usually requires writing a set of procedures that create and manipulate instances of that structure. The efficiency of a data structure cannot be analyzed separately from those operations. This observation motivates the theoretical concept of an abstract data type, a data structure that is defined indirectly by the operations that may be performed on it, and the mathematical properties of those operations (including their space and time cost).
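The stack is a standard illustration of this idea: as an abstract data type it is defined only by its operations and their costs, not by any particular memory layout. A minimal sketch:

```python
class Stack:
    """Abstract data type: defined by its operations, not its layout.

    The backing store here happens to be a Python list, but any
    implementation honoring these operations and costs would do.
    """
    def __init__(self):
        self._items = []

    def push(self, x):   # amortized O(1) time
        self._items.append(x)

    def pop(self):       # O(1) time
        return self._items.pop()

    def peek(self):      # O(1) time
        return self._items[-1]

    def is_empty(self):
        return not self._items

s = Stack()
s.push(1)
s.push(2)
assert s.peek() == 2
assert s.pop() == 2
assert s.pop() == 1
assert s.is_empty()
```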
96. Database management system
A database management system (DBMS) is computer software that manages databases. DBMSes may use any of a variety of database models, such as the network model or relational model. In large systems, a DBMS allows users and other software to store and retrieve data in a structured way.
A DBMS is a set of software programs that controls the organization, storage, management, and retrieval of data in a database; DBMSes are categorized according to their data structures or types. The DBMS accepts requests for data from the application program and instructs the operating system to transfer the appropriate data. When a DBMS is used, information systems can be changed much more easily as the organization's information requirements change: new categories of data can be added to the database without disruption to the existing system.
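Python's bundled sqlite3 module gives a small, self-contained taste of this flow: the application issues requests, the DBMS handles storage and retrieval, and a new category of data can be added without disturbing existing records. The table and column names below are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers (name) VALUES ('Ada')")

# The application asks for data; the DBMS finds and returns it.
rows = conn.execute("SELECT name FROM customers").fetchall()

# A new category of data, added without disruption to existing rows.
conn.execute("ALTER TABLE customers ADD COLUMN email TEXT")
rows_after = conn.execute("SELECT name, email FROM customers").fetchall()
```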
Organizations may use one kind of DBMS for daily transaction processing and then move the detail onto another computer that uses another DBMS better suited for random inquiries and analysis. Overall systems design decisions are performed by data administrators and systems analysts. Detailed database design is performed by database administrators.
Database servers are computers that hold the actual databases and run only the DBMS and related software. Database servers are usually multiprocessor computers, with generous memory and RAID disk arrays used for stable storage. Connected to one or more servers via a high-speed channel, hardware database accelerators are also used in large volume transaction processing environments. DBMSs are found at the heart of most database applications. Sometimes DBMSs are built around a private multitasking kernel with built-in networking support although nowadays these functions are left to the operating system.
97. Data mining
Data mining is the process of extracting hidden patterns from data. As more data is gathered, with the amount of data doubling every three years,[1] data mining is becoming an increasingly important tool to transform this data into information. It is commonly used in a wide range of profiling practices, such as marketing, surveillance, fraud detection and scientific discovery.
While data mining can be used to uncover patterns in data samples, it is important to be aware that the use of non-representative samples of data may produce results that are not indicative of the domain. Similarly, data mining will not find patterns that may be present in the domain, if those patterns are not present in the sample being "mined". There is a tendency for insufficiently knowledgeable "consumers" of the results to attribute "magical abilities" to data mining, treating the technique as a sort of all-seeing crystal ball. Like any other tool, it only functions in conjunction with the appropriate raw material: in this case, indicative and representative data that the user must first collect. Further, the discovery of a particular pattern in a particular set of data does not necessarily mean that pattern is representative of the whole population from which that data was drawn. Hence, an important part of the process is the verification and validation of patterns on other samples of data.
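The sampling pitfall is easy to demonstrate: a "pattern" mined from a biased sample fails to hold on the wider population it was drawn from. The data below is synthetic:

```python
import random

random.seed(42)
population = [random.gauss(50, 10) for _ in range(10_000)]

# A non-representative sample: only the smallest values were collected.
biased_sample = sorted(population)[:100]

pattern_from_sample = sum(biased_sample) / len(biased_sample)
population_mean = sum(population) / len(population)

# Validating the "discovery" against the full population exposes it:
# the mined average is far below the true one.
assert pattern_from_sample < population_mean - 10
```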
The term data mining has also been used in a related but negative sense, to mean the deliberate searching for apparent but not necessarily representative patterns in large amounts of data. To avoid confusion with the other sense, the terms data dredging and data snooping are often used. Note, however, that dredging and snooping can be (and sometimes are) used as exploratory tools when developing and clarifying hypotheses.
Humans have been "manually" extracting patterns from data for centuries, but the increasing volume of data in modern times has called for more automatic approaches. Early methods of identifying patterns in data include Bayes' theorem (1700s) and regression analysis (1800s). The proliferation, ubiquity and increasing power of computer technology have increased data collection and storage. As data sets have grown in size and complexity, direct hands-on data analysis has increasingly been augmented with indirect, automatic data processing. This has been aided by other discoveries in computer science, such as neural networks, clustering, genetic algorithms (1950s), decision trees (1960s) and support vector machines (1980s). Data mining is the process of applying these methods to data with the intention of uncovering hidden patterns. It has been used for many years by businesses, scientists and governments to sift through volumes of data such as airline passenger trip records, census data and supermarket scanner data to produce market research reports. (Note, however, that reporting is not always considered to be data mining.)
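Regression analysis, one of the earliest pattern-finding methods listed above, can be sketched with a hand-rolled least-squares fit on made-up points:

```python
# Made-up (x, y) observations that roughly follow y = 2x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least squares: slope = cov(x, y) / var(x).
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x
```

The fitted line (slope about 1.97, intercept about 0.11) recovers the hidden pattern y ≈ 2x from the noisy observations.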
98. Data warehouse
Data warehouse is a repository of an organization's electronically stored data. Data warehouses are designed to facilitate reporting and analysis.
This definition of the data warehouse focuses on data storage. However, the means to retrieve and analyze data, to extract, transform and load data, and to manage the data dictionary are also considered essential components of a data warehousing system. Many references to data warehousing use this broader context. Thus, an expanded definition for data warehousing includes business intelligence tools, tools to extract, transform, and load data into the repository, and tools to manage and retrieve metadata. In contrast to data warehouses are operational databases that support day-to-day transaction processing.
The concept of data warehousing dates back to the late 1980s when IBM researchers Barry Devlin and Paul Murphy developed the "business data warehouse". In essence, the data warehousing concept was intended to provide an architectural model for the flow of data from operational systems to decision support environments. The concept attempted to address the various problems associated with this flow - mainly, the high costs associated with it. In the absence of a data warehousing architecture, an enormous amount of redundancy was required to support multiple decision support environments. In larger corporations it was typical for multiple decision support environments to operate independently.
Each environment served different users but often required much of the same data. The process of gathering, cleaning and integrating data from various sources, usually long existing operational systems (usually referred to as legacy systems), was typically in part replicated for each environment. Moreover, the operational systems were frequently reexamined as new decision support requirements emerged. Often new requirements necessitated gathering, cleaning and integrating new data from the operational systems that were logically related to prior gathered data.
99. Distributed computing
Distributed computing deals with hardware and software systems containing more than one processing element or storage element, concurrent processes, or multiple programs, running under a loosely or tightly controlled regime.
In distributed computing a program is split up into parts that run simultaneously on multiple computers communicating over a network. Distributed computing is a form of parallel computing, but parallel computing is most commonly used to describe program parts running simultaneously on multiple processors in the same computer. Both types of processing require dividing a program into parts that can run simultaneously, but distributed programs often must deal with heterogeneous environments, network links of varying latencies, and unpredictable failures in the network or the computers.
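The split-and-combine idea is the same whether the workers are threads, processors in one machine, or networked computers. A minimal single-machine sketch using Python's standard thread pool (a real distributed version would ship the chunks over the network to remote workers instead):

```python
from concurrent.futures import ThreadPoolExecutor

def part_sum(chunk):
    # The piece of the program each worker runs independently.
    return sum(chunk)

data = list(range(1_000))
chunks = [data[i::4] for i in range(4)]  # divide the work into 4 parts

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(part_sum, chunks))

total = sum(partials)  # a coordinator combines the partial results
assert total == sum(data)
```

In a distributed setting, the coordinator would additionally have to cope with slow links and failed workers, which is precisely what makes distributed programs harder than this sketch.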
Organizing the interaction between the computers that execute distributed computations is of prime importance. In order to be able to use the widest possible variety of computers, the protocol or communication channel should not contain or use any information that may not be understood by certain machines. Special care must also be taken that messages are indeed delivered correctly and that invalid messages, which would otherwise bring down the system and perhaps the rest of the network, are rejected.
Another important factor is the ability to send software to another computer in a portable way so that it may execute and interact with the existing network. This may not always be practical when using differing hardware and resources, in which case other methods, such as cross-compiling or manually porting this software, must be used.
There are many different types of distributed computing systems and many challenges to overcome in successfully designing one. The main goal of a distributed computing system is to connect users and resources in a transparent, open, and scalable way. Ideally this arrangement is drastically more fault tolerant and more powerful than many combinations of stand-alone computer systems.
100. Grid computing
Grid computing (or the use of computational grids) is the application of several computers to a single problem at the same time — usually to a scientific or technical problem that requires a great number of computer processing cycles or access to large amounts of data.
One of the main grid computing strategies is to use software to divide and apportion pieces of a program among several computers, sometimes up to many thousands. Grid computing can also be thought of as distributed and large-scale cluster computing, as well as a form of network-distributed parallel processing. It can be small — confined to a network of computer workstations within a corporation, for example — or it can be a large, public collaboration across many companies or networks. A confined grid is sometimes described as intra-node cooperation, while a larger, wider grid involves inter-node cooperation; such cooperation across cyber-based collaborative organizations is also known as a "virtual organization".
It is a form of distributed computing whereby a “super and virtual computer” is composed of a cluster of networked, loosely coupled computers, acting in concert to perform very large tasks. This technology has been applied to computationally intensive scientific, mathematical, and academic problems through volunteer computing, and it is used in commercial enterprises for such diverse applications as drug discovery, economic forecasting, seismic analysis, and back-office data processing in support of e-commerce and Web services.
What distinguishes grid computing from conventional cluster computing systems is that grids tend to be more loosely coupled, heterogeneous, and geographically dispersed. Also, while a computing grid may be dedicated to a specialized application, it is often constructed with the aid of general-purpose grid software libraries and middleware.
101. Digital image processing
Digital image processing is the use of computer algorithms to perform image processing on digital images. As a subfield of digital signal processing, digital image processing has many advantages over analog image processing; it allows a much wider range of algorithms to be applied to the input data, and can avoid problems such as the build-up of noise and signal distortion during processing.
Many of the techniques of digital image processing, or digital picture processing as it was often called, were developed in the 1960s at the Jet Propulsion Laboratory, MIT, Bell Labs, University of Maryland, and a few other places, with application to satellite imagery, wirephoto standards conversion, medical imaging, videophone, character recognition, and photo enhancement. But the cost of processing was fairly high with the computing equipment of that era. In the 1970s, digital image processing proliferated, when cheaper computers and dedicated hardware became available. Images could then be processed in real time, for some dedicated problems such as television standards conversion. As general-purpose computers became faster, they started to take over the role of dedicated hardware for all but the most specialized and compute-intensive operations.
With the fast computers and signal processors available in the 2000s, digital image processing has become the most common form of image processing, and is generally used because it is not only the most versatile method, but also the cheapest.
Digital cameras generally include dedicated digital image processing chips to convert the raw data from the image sensor into a color-corrected image in a standard image file format. Images from digital cameras often receive further processing to improve their quality, a distinct advantage digital cameras have over film cameras. This processing is typically done by special software programs that can manipulate the images in many ways. Many digital cameras also enable viewing of histograms of images, as an aid for the photographer to better understand the rendered brightness range of each shot.
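The brightness histogram a camera displays can be computed in a few lines; the tiny 4x4 "image" below is invented for the example:

```python
# A tiny 4x4 grayscale image, pixel values 0-255 (made-up data).
image = [
    [ 10,  10, 200, 200],
    [ 10,  50, 200, 255],
    [ 50,  50, 128, 128],
    [  0,  50, 128, 200],
]

# Bucket the 0-255 brightness range into 8 equal-width bins.
bins = [0] * 8
for row in image:
    for pixel in row:
        bins[min(pixel * 8 // 256, 7)] += 1
```

A real camera does the same thing over millions of pixels: tall bars at the left edge of the histogram warn of crushed shadows, tall bars at the right warn of blown highlights.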
102. Computed tomography
Computed tomography (CT) is a medical imaging method employing tomography. Digital geometry processing is used to generate a three-dimensional image of the inside of an object from a large series of two-dimensional X-ray images taken around a single axis of rotation. The word "tomography" is derived from the Greek tomos (slice) and graphein (to write). Computed tomography was originally known as the "EMI scan" as it was developed at a research branch of EMI, a company best known today for its music and recording business. It was later known as computed axial tomography (CAT or CT scan) and body section röntgenography.
CT produces a volume of data which can be manipulated, through a process known as "windowing", in order to demonstrate various bodily structures based on their ability to block the X-ray/Röntgen beam. Although historically the images generated were in the axial or transverse plane, orthogonal to the long axis of the body, modern scanners allow this volume of data to be reformatted in various planes or even as volumetric (3D) representations of structures. Although most common in medicine, CT is also used in other fields, such as nondestructive materials testing. Another example is the DigiMorph project at the University of Texas at Austin which uses a CT scanner to study biological and paleontological specimens.
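Windowing maps a chosen range of attenuation values (Hounsfield units) onto the display's grey scale: everything below the window renders black, everything above renders white, and values inside get the intermediate greys. A sketch, using typical soft-tissue settings for illustration:

```python
def window(hu, center=40, width=400):
    """Map a Hounsfield value onto an 8-bit grey level (0-255)."""
    lo = center - width / 2
    hi = center + width / 2
    if hu <= lo:
        return 0      # below the window: black
    if hu >= hi:
        return 255    # above the window: white
    return round((hu - lo) / (hi - lo) * 255)

# Air (about -1000 HU) renders black, dense bone (about +1000 HU)
# renders white, and soft tissue falls in between.
assert window(-1000) == 0
assert window(1000) == 255
```

Narrowing the width spreads the display's 256 grey levels over a smaller range of attenuation values, which is how radiologists bring out subtle contrast in, say, brain tissue.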
It has been claimed that thanks to the success of The Beatles, EMI could fund research and build early models for medical use.[3] The first production X-ray CT machine (in fact called the "EMI-Scanner") was limited to making tomographic sections of the brain, but acquired the image data in about 4 minutes (scanning two adjacent slices), and the computation time (using a Data General Nova minicomputer) was about 7 minutes per picture. This scanner required the use of a water-filled Perspex tank with a pre-shaped rubber "head-cap" at the front, which enclosed the patient's head. The water-tank was used to reduce the dynamic range of the radiation reaching the detectors (between scanning outside the head compared with scanning through the bone of the skull). The images were relatively low resolution, being composed of a matrix of only 80 x 80 pixels.
In the U.S., the first installation was at the Mayo Clinic. As a tribute to the impact of this system on medical imaging the Mayo Clinic has an EMI scanner on display in the Radiology Department. The first CT system that could make images of any part of the body and did not require the "water tank" was the ACTA (Automatic Computerized Transverse Axial) scanner designed by Robert S. Ledley, DDS. This machine had 30 photomultiplier tubes as detectors and completed a scan in only 9 translate/rotate cycles, much faster than the EMI-scanner. It used a DEC PDP11/34 minicomputer both to operate the servo-mechanisms and to acquire and process the images. The Pfizer drug company acquired the prototype from the university, along with rights to manufacture it. Pfizer then began making copies of the prototype, calling it the "200FS" (FS meaning Fast Scan), which were selling as fast as they could make them. This unit produced images in a 256x256 matrix, with much better definition than the EMI-Scanner's 80x80.
103. Surgical instrument
A surgical instrument is a specially designed tool or device for performing specific actions or carrying out desired effects during a surgery or operation, such as modifying biological tissue, or to provide access for viewing it. Over time, many different kinds of surgical instruments and tools have been invented. Some surgical instruments are designed for general use in surgery, while others are designed for a specific procedure or surgery. Accordingly, the nomenclature of surgical instruments follows certain patterns, such as a description of the action it performs (for example, scalpel, hemostat), the name of its inventor(s) (for example, the Kocher forceps), or a compound scientific name related to the kind of surgery (for example, a tracheotome is a tool used to perform a tracheotomy).
The expression surgical instrumentation is somewhat interchangeably used with surgical instruments, but its meaning in medical jargon is really the activity of providing assistance to a surgeon with the proper handling of surgical instruments during an operation, by a specialized professional, usually a surgical technologist or sometimes a nurse.
One of the key players who made a real breakthrough in surgical instrumentation was Abu al-Qasim al-Zahrawi, known in the West as Abulcasis and considered the "father of modern surgery". Al-Zahrawi wrote his famous Al-Tasrif liman 'Ajiza 'an Al Ta'leef (written in 1000 CE), translated as The Method of Medicine and often referred to as Al-Tasrif, after long experience accumulated over fifty years of practicing medicine. The book was therefore aimed at establishing general guidelines in practical medicine by emphasizing the "do" and "don't" of almost every issue encountered and the solutions/treatments he provided or invented during this long experience. To complete his practical guide to solving various surgical problems, Al-Zahrawi ended this thirty-volume medical encyclopedia with a treatise in which he introduces his famous collection of surgical tools, exceeding a staggering total of 200 pieces. With its innovative title "On Surgery", the treatise is considered the earliest elucidation compiled on the subject, and it remained the single best medieval source on the matter until modern times. In the words of Leclerc: "Al-Zahrawi remains a leading scholar who transformed surgery into an independent science based on the knowledge of anatomy. His illustration and drawing of the tools is an innovation that keeps his contribution alive, reflected in its continuous influence on the works of those who came after him."
Additionally, Galen of Pergamum, one of the most profound philosophers, surgeons, medical philologists and physicians of the ancient world, requested that his specialized surgical instruments be made of iron ore found only in a quarry in the Celtic kingdom of Noricum. Galen along with other early Arab doctors pioneered the approach to medical instrumentation and his followers of the medieval period manufactured their instruments based on Galen's early designs.
104. Surgical pathology
Surgical pathology is the most significant and time-consuming area of practice for most anatomical pathologists. Surgical pathology involves the gross and microscopic examination of surgical specimens, as well as biopsies submitted by non-surgeons such as general internists, medical subspecialists, dermatologists, and interventional radiologists.
The practice of surgical pathology allows for definitive diagnosis of disease (or lack thereof) in any case where tissue is surgically removed from a patient. This is usually performed by a combination of gross (i.e. macroscopic) and histologic (i.e. microscopic) examination of the tissue, and may involve evaluations of molecular properties of the tissue by immunohistochemistry or other laboratory tests.
A biopsy is a small piece of tissue removed primarily for the purposes of surgical pathology analysis, most often in order to render a definitive diagnosis. Types of biopsies include core biopsies which are obtained through the use of large-bore needles, sometimes under the guidance of radiological techniques such as ultrasound, CT scan, or magnetic resonance imaging. Core biopsies, which preserve tissue architecture, should not be confused with fine needle aspiration specimens, which are analyzed using cytopathology techniques. Incisional biopsies are obtained through diagnostic surgical procedures which remove part of a suspicious lesion, while excisional biopsies remove the entire lesion, and are similar to therapeutic surgical resections. Excisional biopsies of skin lesions and gastrointestinal polyps are very common. The pathologist's interpretation of a biopsy is critical to establishing the diagnosis of a benign or malignant tumor, and can differentiate between different types and grades of cancer, as well as determining the activity of specific molecular pathways in the tumor. This information is important for estimating the patient's prognosis and for choosing the best treatment to administer. Biopsies are also used to diagnose diseases other than cancer, including inflammatory, infectious, or idiopathic diseases of the skin and gastrointestinal tract, to name only a few.
Surgical resection specimens are obtained by the therapeutic surgical removal of an entire diseased area or organ (and occasionally multiple organs). These procedures are often intended as definitive surgical treatment of a disease in which the diagnosis is already known or strongly suspected. However, pathological analysis of these specimens is critically important in confirming the previous diagnosis, staging the extent of malignant disease, establishing whether or not the entire diseased area was removed (sometimes intraoperatively through the technique of frozen section), identifying the presence of unsuspected concurrent diseases, and providing information for postoperative treatment, such as adjuvant chemotherapy in the case of cancer.
105. Surgical oncology
Surgical oncology is the branch of surgery which focuses on the surgical management of malignant neoplasms (cancer).
Whether surgical oncology constitutes a medical specialty per se is the topic of a heated debate. Today, some would agree that it is simply impossible for any one surgeon to be competent in the surgical management of all malignant disease. However, there are currently 14 surgical oncology fellowship training programs in the United States that have been approved by the Society of Surgical Oncology. While many general surgeons are actively involved in treating patients with malignant neoplasms, the designation of "surgical oncologist" is generally reserved for those surgeons who have completed one of the approved fellowship programs. However, this is a matter of semantics, as many surgeons who are thoroughly involved in treating cancer patients may consider themselves to be surgical oncologists.
Most often, surgical oncologist refers to a general surgical oncologist (cf. General Surgery), but thoracic surgical oncologists, gynecologic oncologists and so forth can all be considered surgeons who specialize in treating cancer patients.
The importance of training surgeons who sub-specialize in cancer surgery lies in evidence, supported by a number of clinical trials, that outcomes in surgical cancer care are positively associated with surgeon volume -- i.e., the more cancer cases a surgeon treats, the more proficient he or she becomes, and his or her patients experience improved survival rates as a result. This is another controversial point, but it is generally accepted -- even as common sense -- that a surgeon who performs a given operation more often will achieve superior results when compared with a surgeon who rarely performs the same procedure. This is particularly true of cancer resections such as pancreaticoduodenectomy (Whipple procedure) for pancreatic cancer, and gastrectomy with extended (D2) lymphadenectomy for gastric cancer.
The specialty of surgical oncology has evolved in steps similar to medical oncology (which grew out of hematology) and radiation oncology (which grew out of radiology). The Ewing Society, known today as the Society of Surgical Oncology, was started by surgeons interested in promoting the field of oncology. Though it is not yet recognized with a specialty board certification, the area of expertise is coming into its own through the success of combined treatment with chemotherapy, radiation and targeted biologic treatments. The proliferation of cancer centers will further popularize the field, as will developments in minimally invasive techniques, palliative surgery, neoadjuvant treatments and cancer prevention.
106. Natural rubber
Natural rubber is an elastomer (an elastic hydrocarbon polymer) that was originally derived from a milky colloidal suspension, or latex, found in the sap of some plants. The purified form of natural rubber is the chemical polyisoprene which can also be produced synthetically. Natural rubber is used extensively in many applications and products as is synthetic rubber. The entropy model of rubber was developed in 1934 by Werner Kuhn.
In most elastic materials, such as metals used in springs, the elastic behavior is caused by bond distortions. When force is applied, bond lengths deviate from the (minimum energy) equilibrium and strain energy is stored electrostatically. Rubber is often assumed to behave in the same way, but it turns out this is a poor description. Rubber is a curious material because, unlike metals, strain energy is stored thermally.
In its relaxed state rubber consists of long, coiled-up polymer chains that are interlinked at a few points. Between a pair of links each monomer can rotate freely about its neighbor. This gives each section of chain leeway to assume a large number of geometries, like a very loose rope attached to a pair of fixed points. At room temperature rubber stores enough kinetic energy so that each section of chain oscillates chaotically, like the above piece of rope being shaken violently.
When rubber is stretched the "loose pieces of rope" are taut and thus no longer able to oscillate. Their kinetic energy is given off as excess heat. Therefore, the entropy decreases when going from the relaxed to the stretched state, and it increases during relaxation. This change in entropy can also be explained by the fact that a tight section of chain can fold in fewer ways (W) than a loose section of chain, at a given temperature (nb. entropy is defined as S=k*ln(W)). Relaxation of a stretched rubber band is thus driven by an increase in entropy, and the force experienced is not electrostatic, rather it is a result of the thermal energy of the material being converted to kinetic energy. Rubber relaxation is endothermic, and for this reason the force exerted by a stretched piece of rubber increases with temperature (metals, for example, become softer as temperature increases). The material undergoes adiabatic cooling during contraction. This property of rubber can easily be verified by holding a stretched rubber band to your lips and relaxing it.
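The entropy bookkeeping above can be made concrete with Boltzmann's formula S = k·ln(W); the conformation counts below are purely illustrative, not measured values:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical counts of chain conformations (illustrative only):
W_relaxed = 1e6    # a loose chain section can fold many ways
W_stretched = 1e3  # a taut section can fold far fewer ways

# S = k * ln(W), so stretching changes entropy by k * ln(W_str / W_rel).
delta_S = k * math.log(W_stretched / W_relaxed)

assert delta_S < 0  # entropy decreases on stretching, as described above
```

Relaxation reverses the sign: W grows back, entropy increases, and that entropy gain is what drives the rubber band's contraction.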
107. Natural gas
Natural gas is a gas consisting primarily of methane. It is found associated with fossil fuels, in coal beds, as methane clathrates, and is created by methanogenic organisms in marshes, bogs, and landfills. It is an important fuel source, a major feedstock for fertilizers, and a potent greenhouse gas.
Natural gas is often informally referred to as simply gas, especially when compared to other energy sources such as electricity. Before natural gas can be used as a fuel, it must undergo extensive processing to remove almost all materials other than methane. The by-products of that processing include ethane, propane, butanes, pentanes and higher molecular weight hydrocarbons, elemental sulfur, and sometimes helium and nitrogen.
Fossil natural gas can be "associated" (found in oil fields) or "non-associated" (isolated in natural gas fields), and is also found in coal beds (as coalbed methane). It sometimes contains significant quantities of ethane, propane, butane, and pentane—heavier hydrocarbons removed prior to use as a consumer fuel—as well as carbon dioxide, nitrogen, helium and hydrogen sulfide.[1] Natural gas is commercially produced from oil fields and natural gas fields. Gas produced from oil wells is called casinghead gas or associated gas. The natural gas industry is producing gas from increasingly more challenging resource types: sour gas, tight gas, shale gas and coalbed methane.
When methane-rich gases are produced by the anaerobic decay of non-fossil organic matter (biomass), these are referred to as biogas (or natural biogas). Sources of biogas include swamps, marshes, and landfills (see landfill gas), as well as sewage sludge and manure by way of anaerobic digesters, in addition to enteric fermentation particularly in cattle.
Methanogenic archaea are responsible for all biological sources of methane, some in symbiotic relationships with other life forms, including termites, ruminants, and cultivated crops. Methane released directly into the atmosphere would be considered a pollutant; however, methane in the atmosphere is oxidised, producing carbon dioxide and water. Methane in the atmosphere has a half-life of seven years, meaning that every seven years, half of the methane present is converted to carbon dioxide and water.
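The stated seven-year half-life is ordinary exponential decay, so a one-line function shows how quickly a given amount of atmospheric methane is converted:

```python
def methane_remaining(initial: float, years: float, half_life: float = 7.0) -> float:
    """Amount of methane left after exponential decay with the stated half-life."""
    return initial * 0.5 ** (years / half_life)

print(methane_remaining(100.0, 7))   # 50.0  -- half gone after one half-life
print(methane_remaining(100.0, 21))  # 12.5  -- one-eighth left after three
```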
108. Naturopathy
Naturopathy (also known as naturopathic medicine or natural medicine) is an eclectic alternative medical system that focuses on natural remedies and the body's vital ability to heal and maintain itself. Naturopathic philosophy favors a holistic approach and minimal use of surgery and drugs. Naturopathy comprises many different treatment modalities of varying degrees of acceptance by the medical community; diet and lifestyle advice may be substantially similar to that offered by non-naturopaths, and acupuncture may help reduce pain in some cases, but homeopathy is often characterized as pseudoscience or quackery.
Naturopathy has its origins in the Nature Cure movement of Europe. The term was coined in 1895 by John Scheel and popularized by Benedict Lust, the "father of U.S. naturopathy".
Naturopathy is practiced in many countries, especially the United States and Canada, and is subject to different standards of regulation and levels of acceptance. The level of medical education among naturopaths also varies, though no naturopathic training program reaches the same level of training as an MD or DO. In the United States and Canada, the designation of Naturopathic Doctor (ND) may be awarded after completion of a four year program of study at an accredited Naturopathic medical school that includes the study of basic medical sciences as well as natural remedies and medical care. The scope of practice varies widely between jurisdictions, and naturopaths in unregulated jurisdictions may use the Naturopathic Doctor designation or other titles regardless of level of education.
After a period of rapid growth, naturopathy went into decline for several decades after the 1930s. In 1910, the Carnegie Foundation for the Advancement of Teaching published the Flexner Report, which criticized many aspects of medical education, especially its quality and lack of scientific rigour. The advent of penicillin and other "miracle drugs" and the consequent popularity of modern medicine also contributed to naturopathy's decline. Following Lust's death in 1945, the ANA split into six distinct organizations. In the 1940s and 1950s, a broadening in scope of practice laws led many chiropractic schools to drop their N.D. degrees, though many chiropractors continued to practice naturopathy. From 1940 to 1963, the American Medical Association campaigned against heterodox medical systems. By 1958, the practice of naturopathy was licensed in only five states. In 1968, the United States Department of Health, Education, and Welfare issued a report on naturopathy concluding that naturopathy was not grounded in medical science and that naturopathic education was inadequate to prepare graduates to make appropriate diagnoses and provide treatment; the report recommended against expanding Medicare coverage to include naturopathic treatments. In 1977, an Australian committee of inquiry reached similar conclusions; it did not recommend licensure for naturopaths.
109. Earthquake
An earthquake (also known as a tremor or temblor) is the result of a sudden release of energy in the Earth's crust that creates seismic waves. Earthquakes are recorded with a seismometer, also known as a seismograph. An earthquake's size is conventionally reported as its moment magnitude, or on the related and mostly obsolete Richter scale, with magnitude-3 or lower earthquakes being mostly imperceptible and magnitude-7 earthquakes causing serious damage over large areas. The intensity of shaking is measured on the modified Mercalli scale.
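A rough illustration of why a magnitude-7 quake is so much more damaging than a magnitude-3: the Gutenberg-Richter energy relation, log10(E) ≈ 1.5·M + 4.8 with E in joules (an empirical approximation, not part of the magnitude scales named above), implies each whole magnitude step multiplies radiated energy by about 31.6:

```python
def seismic_energy_joules(magnitude: float) -> float:
    """Approximate radiated seismic energy via the empirical
    Gutenberg-Richter relation log10(E) = 1.5*M + 4.8 (E in joules)."""
    return 10 ** (1.5 * magnitude + 4.8)

# Four magnitude steps: energy ratio of 10**(1.5 * 4) = one million.
ratio = seismic_energy_joules(7) / seismic_energy_joules(3)
print(f"A magnitude-7 quake radiates ~{ratio:.0e} times the energy of a magnitude-3")
```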
At the Earth's surface, earthquakes manifest themselves by shaking and sometimes displacing the ground. When a large earthquake epicenter is located offshore, the seabed sometimes suffers sufficient displacement to cause a tsunami. The shaking in earthquakes can also trigger landslides and occasionally volcanic activity.
In its most generic sense, the word earthquake is used to describe any seismic event — whether a natural phenomenon or an event caused by humans — that generates seismic waves. Earthquakes are caused mostly by rupture of geological faults, but also by volcanic activity, landslides, mine blasts, and nuclear experiments. An earthquake's point of initial rupture is called its focus or hypocenter. The term epicenter refers to the point at ground level directly above the hypocenter.
Tectonic earthquakes will occur anywhere within the earth where there is sufficient stored elastic strain energy to drive fracture propagation along a fault plane. In the case of transform or convergent type plate boundaries, which form the largest fault surfaces on earth, the two sides will move past each other smoothly and aseismically only if there are no irregularities or asperities along the boundary that increase the frictional resistance. Most boundaries do have such asperities, and this leads to a form of stick-slip behavior. Once the boundary has locked, continued relative motion between the plates leads to increasing stress and, therefore, stored strain energy in the volume around the fault surface. This continues until the stress has risen sufficiently to break through the asperity, suddenly allowing sliding over the locked portion of the fault and releasing the stored energy. This energy is released as a combination of radiated elastic strain seismic waves, frictional heating of the fault surface, and cracking of the rock, thus causing an earthquake. This process of gradual build-up of strain and stress punctuated by occasional sudden earthquake failure is referred to as the elastic-rebound theory. It is estimated that only 10 percent or less of an earthquake's total energy is radiated as seismic energy. Most of the earthquake's energy is used to power the earthquake fracture growth or is converted into heat generated by friction. Therefore, earthquakes lower the Earth's available elastic potential energy and raise its temperature, though these changes are negligible compared to the conductive and convective flow of heat out from the Earth's deep interior.
110. Homeopathy
Homeopathy is a form of alternative medicine that treats patients with heavily diluted preparations that are thought to cause effects similar to the symptoms presented, first expounded by German physician Samuel Hahnemann in 1796. Homeopathic remedies are prepared by serial dilution with shaking by forceful striking ("succussing") after each dilution under the assumption that this increases the effect of the treatment; this process is referred to as "potentization". Dilution often continues until none of the original substance remains.
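The arithmetic of serial dilution makes the last sentence concrete. A sketch, assuming one mole of starting substance and standard centesimal ("C", 1:100) dilutions:

```python
AVOGADRO = 6.02214076e23  # molecules per mole

def molecules_left(moles: float, c_dilutions: int) -> float:
    """Expected molecules of the original substance remaining after a
    number of centesimal (1:100) serial dilutions."""
    return moles * AVOGADRO / 100 ** c_dilutions

# One mole of starting substance after a common 30C potentization:
print(molecules_left(1.0, 30))  # ~6e-37, i.e. effectively zero molecules
```

At 30C the dilution factor is 10^60, vastly exceeding Avogadro's number, which is why such remedies are expected to contain none of the original substance.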
Apart from the symptoms of the disease, homeopaths use aspects of the patient's physical and psychological state in recommending remedies. Homeopathic reference books known as repertories are then consulted, and a remedy is selected based on the index of symptoms. Homeopathic remedies are generally considered safe, with rare exceptions. However, homeopaths have been criticized for putting patients at risk with advice to avoid conventional medicine, such as vaccinations, anti-malarial drugs, and antibiotics. In many countries, the laws that govern the regulation and testing of conventional drugs do not apply to homeopathic remedies.
Claims of homeopathy's efficacy beyond the placebo effect are unsupported by the collective weight of scientific and clinical evidence. Supporters claim that studies published in reputable journals support the efficacy of homeopathy; however, there are only a handful of them, they are not definitive and they have not been replicated. Several high-quality studies exist showing no evidence for any effect from homeopathy, and the few positive studies of homeopathic remedies have generally been shown to have problems that prevent them from being considered unambiguous evidence for homeopathy's efficacy.
Homeopathic remedies generally contain few or no pharmacologically active ingredients, and for such remedies to have pharmacological effect would violate fundamental principles of science. Modern homeopaths have proposed that water has a memory that allows homeopathic preparations to work without any of the original substance; however, the physics of water are well understood, and no known mechanism permits such a memory. The lack of convincing scientific evidence supporting homeopathy's efficacy and its use of remedies lacking active ingredients have caused homeopathy to be described as pseudoscience and quackery.
111. Visual communication
Visual communication is, as the name suggests, communication through visual aids: the conveyance of ideas and information in forms that can be read or looked upon. Primarily associated with two-dimensional images, it includes signs, typography, drawing, graphic design, illustration, color and electronic resources, and it relies solely on vision. It explores the idea that a visual message accompanying text has a greater power to inform, educate or persuade a person.
The evaluation of a good visual design is based on measuring comprehension by the audience, not on aesthetic or artistic preference; there are no universally agreed-upon principles of beauty and ugliness. Information can be presented visually in a variety of ways, such as gestures, body language, video and TV. Here, the focus is on the presentation of text, pictures, diagrams, photos, et cetera, integrated on a computer display. The term visual presentation is used to refer to the actual presentation of information. Recent research in the field has focused on web design and graphically oriented usability. Graphic designers use methods of visual communication in their professional practice.
Visual communication on the World Wide Web is perhaps the most important form of communication taking place while users are surfing the Internet. When experiencing the web, one uses the eyes as the primary sense, and therefore the visual display of a website is important for the user's understanding of the communication taking place.
Visual communication by e-mail, a textual medium, is commonly approximated with ASCII art, emoticons, and embedded digital images.
112. Telecommunication
Telecommunication is the assisted transmission of signals over a distance for the purpose of communication. In earlier times, this may have involved the use of smoke signals, drums, semaphore, flags or heliograph. In modern times, telecommunication typically involves the use of electronic devices such as the telephone, television, radio or computer. Early inventors in the field of telecommunication include Alexander Graham Bell, Guglielmo Marconi and John Logie Baird. Telecommunication is an important part of the world economy and the telecommunication industry's revenue was estimated to be $1.2 trillion in 2006.
In an analogue telephone network, the caller is connected to the person he wants to talk to by switches at various telephone exchanges. The switches form an electrical connection between the two users and the setting of these switches is determined electronically when the caller dials the number. Once the connection is made, the caller's voice is transformed to an electrical signal using a small microphone in the caller's handset. This electrical signal is then sent through the network to the user at the other end where it is transformed back into sound by a small speaker in that person's handset. There is a separate electrical connection that works in reverse, allowing the users to converse.
The fixed-line telephones in most residential homes are analogue — that is, the speaker's voice directly determines the signal's voltage. Although short-distance calls may be handled from end-to-end as analogue signals, increasingly telephone service providers are transparently converting the signals to digital for transmission before converting them back to analogue for reception. The advantage of this is that digitized voice data can travel side-by-side with data from the Internet and can be perfectly reproduced in long distance communication (as opposed to analogue signals that are inevitably impacted by noise).
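As a rough sketch of what "converting to digital" means here, the toy function below samples a tone at the classic 8 kHz telephony rate and quantizes each sample to 8 bits. The rate, bit depth and tone frequency are illustrative choices only, not a real PCM codec:

```python
import math

def pcm_samples(freq_hz: float, sample_rate: int = 8000, bits: int = 8, n: int = 8):
    """Sample a sine tone and quantize each sample to `bits` bits,
    as in simple PCM telephony (8 kHz, 8-bit shown here)."""
    levels = 2 ** bits
    out = []
    for i in range(n):
        s = math.sin(2 * math.pi * freq_hz * i / sample_rate)  # analogue value in [-1, 1]
        q = round((s + 1) / 2 * (levels - 1))                   # map to integer 0..levels-1
        out.append(q)
    return out

print(pcm_samples(440.0))  # first 8 quantized samples of a 440 Hz tone
```

Once the voice is a stream of small integers like this, it can share transmission links with any other digital data and be reproduced exactly at the far end.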
Mobile phones have had a significant impact on telephone networks. Mobile phone subscriptions now outnumber fixed-line subscriptions in many markets. Sales of mobile phones in 2005 totalled 816.6 million with that figure being almost equally shared amongst the markets of Asia/Pacific (204 m), Western Europe (164 m), CEMEA (Central Europe, the Middle East and Africa) (153.5 m), North America (148 m) and Latin America (102 m). In terms of new subscriptions over the five years from 1999, Africa has outpaced other markets with 58.2% growth. Increasingly these phones are being serviced by systems where the voice content is transmitted digitally such as GSM or W-CDMA, with many markets choosing to deprecate analogue systems such as AMPS.
There have also been dramatic changes in telephone communication behind the scenes. Starting with the operation of TAT-8 in 1988, the 1990s saw the widespread adoption of systems based on optic fibers. The benefit of communicating with optic fibres is that they offer a drastic increase in data capacity. TAT-8 itself was able to carry 10 times as many telephone calls as the last copper cable laid at that time and today's optic fibre cables are able to carry 25 times as many telephone calls as TAT-8. This increase in data capacity is due to several factors: First, optic fibres are physically much smaller than competing technologies. Second, they do not suffer from crosstalk which means several hundred of them can be easily bundled together in a single cable. Lastly, improvements in multiplexing have led to an exponential growth in the data capacity of a single fibre.
Assisting communication across many modern optic fibre networks is a protocol known as Asynchronous Transfer Mode (ATM). The ATM protocol allows for the side-by-side data transmission mentioned in the second paragraph. It is suitable for public telephone networks because it establishes a pathway for data through the network and associates a traffic contract with that pathway. The traffic contract is essentially an agreement between the client and the network about how the network is to handle the data; if the network cannot meet the conditions of the traffic contract it does not accept the connection. This is important because telephone calls can negotiate a contract so as to guarantee themselves a constant bit rate, something that will ensure a caller's voice is not delayed in parts or cut-off completely.[47] There are competitors to ATM, such as Multiprotocol Label Switching (MPLS), that perform a similar task and are expected to supplant ATM in the future.
113. Hard disk drive
A hard disk drive (often shortened as "hard disk", "hard drive", or "HDD"), is a non-volatile storage device which stores digitally encoded data on rapidly rotating platters with magnetic surfaces. Strictly speaking, "drive" refers to a device distinct from its medium, such as a tape drive and its tape, or a floppy disk drive and its floppy disk. Early HDDs had removable media; however, an HDD today is typically a sealed unit (except for a filtered vent hole to equalize air pressure) with fixed media.
The platters are spun at very high speeds. Information is written to a platter as it rotates past devices called read-and-write heads that fly very close (tens of nanometers in new drives) above the magnetic surface. The read-and-write head is used to detect and modify the magnetization of the material immediately under it. There is one head for each magnetic platter surface on the spindle, mounted on a common arm. An actuator arm (or access arm) moves the heads on an arc (roughly radially) across the platters as they spin, allowing each head to access almost the entire surface of the platter. The arm is moved using a voice coil actuator or, in some older designs, a stepper motor.
The magnetic recording media are CoCrPt-based magnetic thin films about 10-20 nm in thickness. The thin films are normally deposited on a glass, ceramic or metal substrate and covered by a thin carbon layer for protection. The Co-based alloy thin films are polycrystalline, with grain sizes on the order of 10 nm. Because the grains are so small, each is typically a single-domain magnet. The media are magnetically hard (coercivity is about 0.3 T) so that a stable remanent magnetization can be achieved. The grain boundaries turn out to be very important: because the grains are very small and close to each other, the coupling between grains is very strong. When one grain is magnetized, the adjacent grains tend to be aligned parallel to it or demagnetized, degrading both the stability of the data and the signal-to-noise ratio. A clear grain boundary weakens the coupling between grains and subsequently increases the signal-to-noise ratio. During the writing process, ideally one grain would store one bit (1/0).
However, current technology cannot yet reach that limit. In practice, a group of grains (about 100) is magnetized as one bit, so in order to increase the data density, smaller grains are required. From a microstructural point of view, longitudinal and perpendicular recording are the same, and similar Co-based thin films are used in both. However, the fabrication processes differ in order to obtain different crystal structures and magnetic properties. In longitudinal recording, the single-domain grains have uniaxial anisotropy with easy axes lying in the film plane. The consequence of this arrangement is that adjacent magnets repel each other; the magnetostatic energy is therefore so large that it is difficult to increase areal density. Perpendicular recording media, on the other hand, have the easy axis of the grains oriented perpendicular to the disk plane. Adjacent magnets attract each other and the magnetostatic energy is much lower, so much higher areal density can be achieved in perpendicular recording. Another unique feature of perpendicular recording is that a soft magnetic underlayer is incorporated into the recording disk. This underlayer is used to conduct the writing magnetic flux so that writing is more efficient, and consequently a higher-anisotropy medium film, such as L10-FePt or a rare-earth magnet, can be used.
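The grain-counting argument can be turned into a back-of-the-envelope areal-density estimate. Using the illustrative numbers from the text (10 nm grains, roughly 100 grains per bit) and treating each grain as a square patch:

```python
def areal_density_gbit_per_in2(grain_nm: float = 10.0, grains_per_bit: int = 100) -> float:
    """Rough areal density if each bit occupies `grains_per_bit` square grains
    of side `grain_nm` nanometres (illustrative numbers from the text)."""
    bit_area_m2 = grains_per_bit * (grain_nm * 1e-9) ** 2
    bits_per_m2 = 1.0 / bit_area_m2
    m2_per_in2 = 0.0254 ** 2        # one square inch in square metres
    return bits_per_m2 * m2_per_in2 / 1e9

print(f"{areal_density_gbit_per_in2():.1f} Gbit/in^2")  # ~64.5 at 100 grains per bit
```

Shrinking toward the ideal of one grain per bit would raise this estimate a hundredfold, which is why smaller, well-isolated grains are the goal.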
114. Card reader
A memory card reader is a device used for communication with a smart card or a flash memory card. A business card reader is a scanning device used to scan and electronically save business cards. A magnetic card reader is a device used to scan cards containing magnetic data strips. A punched card reader is a device used to read holes in punched cardboard cards.
A smart card reader is an electronic device that reads smart cards. Some keyboards have a built-in card reader, and there are external devices and internal drive-bay card readers for PCs. Some laptops have a built-in smart card reader, and some readers have flash-upgradeable firmware. The card reader supplies the integrated circuit on the smart card with electricity. Communication is done via protocols, and data can be read from and written to a fixed address on the card.
A barcode is a series of alternating dark and light stripes that are read by an optical scanner. The organization and width of the lines is determined by the bar code protocol selected. There are many different protocols but Code 39 is the most popular in the security industry. Sometimes the digits represented by the dark and light bars are also printed to allow people to read the number without an optical reader. The advantage of using bar code technology is that it is cheap and easy to generate the credential and it can easily be applied to cards or other items. The disadvantage of this technology is that it is cheap and easy to generate a credential making the technology susceptible to fraud and the optical reader can have reliability problems with dirty or smudged credentials. One attempt to reduce fraud is to print the bar code using carbon-based ink and then cover the bar code with a dark red overlay. The bar code can then be read with an optical reader tuned to the infrared spectrum, but can not easily be copied by a copy machine. This does not address the ease with which bar code numbers can be generated from a computer using almost any printer.
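One common guard against misreads of dirty or smudged labels is an optional mod-43 check character computed over the standard 43-character Code 39 set. This is a sketch of that checksum only, not a full Code 39 encoder:

```python
# Code 39 character set in value order (values 0-42)
CODE39 = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ-. $/+%"

def code39_check_char(data: str) -> str:
    """Mod-43 check character sometimes appended to Code 39 data so a
    reader can detect a corrupted scan."""
    total = sum(CODE39.index(c) for c in data)
    return CODE39[total % 43]

print(code39_check_char("CODE39"))  # 'W'
```

A reader that recomputes the check character and finds a mismatch can reject the scan rather than report a wrong number.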
There are several forms of biometric identification employed in access control: fingerprint, hand geometry, iris and face recognition. The use of biometric technology significantly increases security level of systems because it eliminates such problems as lost, stolen or loaned ID cards, and forgotten or guessed PINs. The operation of all biometric readers is alike: they compare the template stored in memory to the scan obtained during the process of identification. If the probability that the template in the memory and the live scan belong to the same person is high enough, the ID number of that person is sent to a control panel. The control panel then checks permissions of the user and makes the decision whether to grant access or not. The communication between the reader and the control panel is usually done in the industry standard Wiegand protocol. The only exception is intelligent biometric readers that do not require any panels and directly control all door hardware.
115. Random-access memory
Random-access memory (usually known by its acronym, RAM) is a form of computer data storage. Today, it takes the form of integrated circuits that allow stored data to be accessed in any order (i.e., at random). The word random thus refers to the fact that any piece of data can be returned in a constant time, regardless of its physical location and whether or not it is related to the previous piece of data.
By contrast, storage devices such as tapes, magnetic discs and optical discs rely on the physical movement of the recording medium or a reading head. In these devices, the movement takes longer than the data transfer, and the retrieval time varies based on the physical location of the next item. The word RAM is often associated with volatile types of memory (such as DRAM memory modules), where the information is lost after the power is switched off. Many other types of memory are RAM as well, including most types of ROM and a type of flash memory called NOR-Flash.
Modern types of writable RAM generally store a bit of data in either the state of a flip-flop, as in SRAM (static RAM), or as a charge in a capacitor (or transistor gate), as in DRAM (dynamic RAM), EPROM, EEPROM and Flash. Some types have circuitry to detect and/or correct random faults called memory errors in the stored data, using parity bits or error correction codes. RAM of the read-only type, ROM, instead uses a metal mask to permanently enable/disable selected transistors, instead of storing a charge in them.
As both SRAM and DRAM are volatile, other forms of computer storage, such as disks and magnetic tapes, have been used as persistent storage in traditional computers. Many newer products instead rely on flash memory to maintain data when not in use, such as PDAs or small music players. Certain personal computers, such as many rugged computers and netbooks, have also replaced magnetic disks with flash drives. With flash memory, only the NOR type is capable of true random access, allowing direct code execution, and is therefore often used instead of ROM; the lower cost NAND type is commonly used for bulk storage in memory cards and solid-state drives.
Similar to a microprocessor, a memory chip is an integrated circuit (IC) made of millions of transistors and capacitors. In the most common form of computer memory, dynamic random access memory (DRAM), a transistor and a capacitor are paired to create a memory cell, which represents a single bit of data. The capacitor holds the bit of information: a 0 or a 1. The transistor acts as a switch that lets the control circuitry on the memory chip read the capacitor or change its state.
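A toy model of the transistor-capacitor cell just described (the numbers and thresholds here are invented for illustration, not real hardware behavior): the capacitor leaks charge, which is why DRAM is volatile and must be periodically refreshed:

```python
class DramCell:
    """Toy model of a one-transistor, one-capacitor DRAM cell."""

    def __init__(self):
        self.charge = 0.0              # normalized capacitor charge, 0..1

    def write(self, bit: int):
        self.charge = 1.0 if bit else 0.0

    def leak(self, factor: float = 0.5):
        self.charge *= factor          # charge decays between refreshes

    def read(self) -> int:
        bit = 1 if self.charge > 0.25 else 0  # sense-amplifier threshold
        self.write(bit)                # reading is destructive; rewrite the sensed value
        return bit

cell = DramCell()
cell.write(1)
cell.leak()          # some decay: the bit is still recoverable
print(cell.read())   # 1, and the read also restored full charge
```

Wait too long between refreshes (several `leak()` calls in this model) and the charge drops below the threshold, so the stored 1 is read back as a 0.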
116. Blu-ray Disc
Blu-ray Disc (also known as Blu-ray or BD) is an optical disc storage medium developed by Sony to supersede the standard DVD format. Its main uses are storing PlayStation 3 games, high-definition video and data, with up to 50 GB per disc. The disc has the same physical dimensions as standard DVDs and CDs.
The name Blu-ray Disc derives from the blue-violet laser used to read the disc. While a standard DVD uses a 650 nanometre red laser, Blu-ray uses a shorter wavelength, a 405 nm blue-violet laser, and allows for almost six times more data storage than on a DVD.
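The wavelength claim can be checked with a rough diffraction-limited estimate: recording density scales roughly as (NA/λ)², where NA is the numerical aperture of the pickup lens. The NA figures below (about 0.6 for DVD, 0.85 for Blu-ray) are widely published optics values, not stated in the text above:

```python
def relative_density(wavelength_nm: float, numerical_aperture: float) -> float:
    """Recording density scales roughly with (NA / wavelength)**2, since the
    focused laser spot shrinks with shorter wavelength and larger NA."""
    return (numerical_aperture / (wavelength_nm * 1e-9)) ** 2

# Typical optics: DVD ~650 nm at NA 0.6, Blu-ray 405 nm at NA 0.85
gain = relative_density(405, 0.85) / relative_density(650, 0.6)
print(f"Blu-ray packs roughly {gain:.1f}x the data density of DVD")  # ~5.2x
```

The ~5.2x figure from this sketch is consistent with the "almost six times" in the text (25 GB vs 4.7 GB per layer is about 5.3x).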
During the format war over high-definition optical discs, Blu-ray competed with the HD DVD format. Toshiba, the main company supporting HD DVD, ceded in February 2008 and the format war ended.
Blu-ray Disc is developed by the Blu-ray Disc Association, a group representing makers of consumer electronics, computer hardware, and motion pictures. As of June 2009, more than 1,000 Blu-ray Disc titles were available in Australia, 2,500 in Japan and 1,500 in the United Kingdom.
At the 2005 JavaOne trade show, it was announced that Sun Microsystems' Java cross-platform software environment would be included in all Blu-ray Disc players as a mandatory part of the standard. Java is used to implement interactive menus on Blu-ray Discs, as opposed to the method used on DVD video discs, which uses pre-rendered MPEG segments and selectable subtitle pictures and is considerably more primitive and rarely seamless. Java creator James Gosling, at the conference, suggested that the inclusion of a Java Virtual Machine, as well as network connectivity in some BD devices, will allow updates to Blu-ray Discs via the Internet, adding content such as additional subtitle languages and promotional features that are not included on the disc at pressing time. This Java version is called BD-J and is a subset of the Globally Executable MHP (GEM) standard; GEM is the worldwide version of the Multimedia Home Platform standard. Most Blu-ray Discs which have BD-J menus do not allow a Blu-ray Disc player to automatically resume a movie from the point where the movie was stopped.
117. DVD
DVD, also known as "Digital Versatile Disc" or "Digital Video Disc," is an optical disc storage media format. Its main uses are video and data storage. DVDs have the same dimensions as compact discs (CDs) but store more than six times as much data. Variations of the term DVD often describe the way data is stored on the discs: DVD-ROM (read-only memory) discs can only be read, not written; DVD-R and DVD+R discs can record data once and then function as DVD-ROMs; and DVD-RW, DVD+RW and DVD-RAM discs can both record and erase data multiple times. The wavelength used by standard DVD lasers is 650 nm,[1] and thus the light has a red color.
DVD-Video and DVD-Audio discs refer to properly formatted and structured video and audio content, respectively. Other types of DVDs, including those with video content, may be referred to as DVD-Data discs. As next-generation high-definition optical formats such as Blu-ray Disc use a disc identical in some aspects, the original DVD is occasionally given the retronym SD DVD (for standard definition). However, the trademarked HD DVD discs have been discontinued since Blu-ray absorbed their market share.
Dual-layer recording allows DVD-R and DVD+R discs to store significantly more data, up to 8.54 gigabytes per disc, compared with 4.7 gigabytes for single-layer discs. DVD-R DL was developed for the DVD Forum by Pioneer Corporation; DVD+R DL was developed for the DVD+RW Alliance by Philips and Mitsubishi Kagaku Media (MKM).
A dual-layer disc differs from its usual DVD counterpart by employing a second physical layer within the disc itself. The drive with dual-layer capability accesses the second layer by shining the laser through the first semitransparent layer. In some DVD players, the layer change can exhibit a noticeable pause, up to several seconds. This caused some viewers to worry that their dual-layer discs were damaged or defective, with the end result that studios began listing a standard message explaining the dual-layer pausing effect on all dual-layer disc packaging.
DVD recordable discs supporting this technology are backward compatible with some existing DVD players and DVD-ROM drives. Many current DVD recorders support dual-layer technology, and the price is now comparable to that of single-layer drives, although the blank media remains more expensive. The recording speeds reached by dual-layer media are still well below those of single-layer media.
There are two modes for dual-layer orientation. With parallel track path (PTP), used on DVD-ROM, both layers start at the inside diameter (ID) and end at the outside diameter (OD) with the lead-out. With opposite track path (OTP), used on many DVD-Video discs, the lower layer starts at the ID and the upper layer starts at the OD, where the other layer ends; they share one lead-in and one lead-out. However, some DVD-Video discs also use a parallel track path, such as discs authored episodically (for example, a disc containing several separate episodes of a TV series), where the layer change usually falls between titles and therefore does not need to be authored in the opposite-track-path fashion.
118. Animation
Animation is the rapid display of a sequence of images of 2-D or 3-D artwork or model positions in order to create an illusion of movement. It is an optical illusion of motion due to the phenomenon of persistence of vision, and can be created and demonstrated in a number of ways. The most common method of presenting animation is as a motion picture or video program, although several other forms of presenting animation also exist.
Traditional animation was the process used for most animated films of the 20th century. The individual frames of a traditionally animated film are photographs of drawings, which are first drawn on paper. To create the illusion of movement, each drawing differs slightly from the one before it. The animators' drawings are traced or photocopied onto transparent acetate sheets called cels, which are filled in with paints in assigned colors or tones on the side opposite the line drawings. The completed character cels are photographed one-by-one onto motion picture film against a painted background by a rostrum camera.
The traditional cel animation process became obsolete by the beginning of the 21st century. Today, animators' drawings and the backgrounds are either scanned into or drawn directly into a computer system. Various software programs are used to color the drawings and simulate camera movement and effects. The final animated piece is output to one of several delivery mediums, including traditional 35 mm film and newer media such as digital video. The "look" of traditional cel animation is still preserved, and the character animators' work has remained essentially the same over the past 70 years. Some animation producers have used the term "tradigital" to describe cel animation which makes extensive use of computer technology.
3D animation uses digital models manipulated by an animator. In order to manipulate a mesh, it is given a digital armature (skeleton); this process is called rigging. Various other techniques can be applied, such as mathematical functions (e.g., gravity, particle simulations), simulated fur or hair, effects such as fire and water, and the use of motion capture, to name but a few. Many 3D animations are very believable and are commonly used as special effects for recent movies.
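Whether frames are drawn on cels or computed for a 3D rig, the in-betweening idea is the same: fill the gaps between key poses. A minimal sketch using linear interpolation between two keyframe values (real animation software uses easing curves rather than straight lerps):

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b at parameter t in [0, 1]."""
    return a + (b - a) * t

def inbetween_frames(key_a: float, key_b: float, n: int):
    """Generate n evenly spaced positions from key pose A to key pose B."""
    return [lerp(key_a, key_b, i / (n - 1)) for i in range(n)]

print(inbetween_frames(0.0, 10.0, 5))  # [0.0, 2.5, 5.0, 7.5, 10.0]
```

Played back rapidly, such a sequence of slightly different positions produces the illusion of smooth movement described at the start of this section.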
119. Cryptography
Cryptography (or cryptology; from Greek kryptos, "hidden, secret"; and gráphō, "I write", -logia, respectively) is the practice and study of hiding information. In modern times cryptography is considered a branch of both mathematics and computer science and is affiliated closely with information theory, computer security and engineering. Cryptography is used in applications present in technologically advanced societies; examples include the security of ATM cards, computer passwords and electronic commerce, which all depend on cryptography.
Before the modern era, cryptography was concerned solely with message confidentiality (i.e., encryption) — conversion of messages from a comprehensible form into an incomprehensible one and back again at the other end, rendering it unreadable by interceptors or eavesdroppers without secret knowledge (namely the key needed for decryption of that message). In recent decades, the field has expanded beyond confidentiality concerns to include techniques for message integrity checking, sender/receiver identity authentication, digital signatures, and interactive proofs and secure computation, among others.
The earliest forms of secret writing required little more than local pen and paper analogs, as most people could not read. Greater literacy, particularly among opponents, required actual cryptography. The main classical cipher types are transposition ciphers, which rearrange the order of letters in a message (e.g., 'hello world' becomes 'ehlol owrdl' in a trivially simple rearrangement scheme), and substitution ciphers, which systematically replace letters or groups of letters with other letters or groups of letters (e.g., 'fly at once' becomes 'gmz bu podf' by replacing each letter with the one following it in the English alphabet). Simple versions of either offered little confidentiality from enterprising opponents, and still don't. An early substitution cipher was the Caesar cipher, in which each letter in the plaintext was replaced by a letter some fixed number of positions further down the alphabet. It was named after Julius Caesar, who is reported to have used it, with a shift of 3, to communicate with his generals during his military campaigns; the fixed offset is reminiscent of the excess-3 code in Boolean algebra.
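The substitution scheme described above can be sketched in a few lines of Python. This is a minimal illustration, not a secure cipher; the function names are invented for the example:

```python
def caesar_encrypt(plaintext, shift):
    """Replace each letter with the one `shift` positions further down
    the alphabet, wrapping around; non-letters pass through unchanged."""
    out = []
    for ch in plaintext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

def caesar_decrypt(ciphertext, shift):
    # Decryption is just encryption with the opposite shift.
    return caesar_encrypt(ciphertext, -shift)
```

With a shift of 1, 'fly at once' encrypts to 'gmz bu podf'; Caesar's reported shift of 3 turns 'hello' into 'khoor'.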
Encryption attempts to ensure secrecy in communications, such as those of spies, military leaders, and diplomats. There is record of several early Hebrew ciphers as well. Cryptography is recommended in the Kama Sutra as a way for lovers to communicate without inconvenient discovery. Steganography (i.e., hiding even the existence of a message so as to keep it confidential) was also first developed in ancient times. An early example, from Herodotus, concealed a message - a tattoo on a slave's shaved head - under the regrown hair. More modern examples of steganography include the use of invisible ink, microdots, and digital watermarks to conceal information.
120. Network security
Network security consists of the provisions made in an underlying computer network infrastructure, the policies adopted by the network administrator to protect the network and network-accessible resources from unauthorized access, and the consistent, continuous monitoring and measurement of their effectiveness (or lack thereof).
The terms network security and information security are often used interchangeably; however, network security is generally taken as providing protection at the boundaries of an organization, keeping intruders (e.g. black hat hackers, script kiddies, etc.) out. Network security systems today are mostly effective, so the focus has shifted to protecting resources from attack or simple mistakes by people inside the organization, e.g. with Data Loss Prevention (DLP). One response to this insider threat in network security is to compartmentalize large networks, so that an employee would have to cross an internal boundary and be authenticated when trying to access privileged information. Information security is explicitly concerned with all aspects of protecting information resources, including network security and DLP.
Network security starts from authenticating any user, commonly (in one-factor authentication) with a username and a password (something you know). With two-factor authentication, something you have is also used (e.g. a security token or 'dongle', an ATM card, or your mobile phone); with three-factor authentication, something you are is also used (e.g. a fingerprint or retinal scan). Once authenticated, a stateful firewall enforces access policies, such as what services are allowed to be accessed by the network users. Though effective in preventing unauthorized access, this component fails to check potentially harmful content, such as computer worms, being transmitted over the network. An intrusion prevention system (IPS) helps detect and inhibit the action of such malware. An anomaly-based intrusion detection system also monitors network traffic for suspicious content, unexpected traffic and other anomalies to protect the network, e.g. from denial-of-service attacks or an employee accessing files at strange times. Communication between two hosts using the network can be encrypted to maintain privacy. Individual events occurring on the network can be tracked for audit purposes and for later high-level analysis.
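The first two factors can be made concrete with a small Python sketch. The helper names and the fixed token value are illustrative assumptions; a real system would use a time-based one-time code (e.g. TOTP) rather than a stored token:

```python
import hashlib
import hmac
import secrets

def hash_password(password, salt):
    # Something you know: store only a salted, slow hash of the password.
    return hashlib.pbkdf2_hmac('sha256', password.encode(), salt, 100_000)

def authenticate(password, token, salt, stored_hash, expected_token):
    # Factor 1: something you know (the password).
    knows = hmac.compare_digest(hash_password(password, salt), stored_hash)
    # Factor 2: something you have (a code read from a token device).
    has = hmac.compare_digest(token, expected_token)
    return knows and has

# enrollment: the server stores only the salt and the hash, never the password
salt = secrets.token_bytes(16)
stored = hash_password('correct horse', salt)
```

`hmac.compare_digest` is used instead of `==` so that comparison time does not leak how many leading characters matched.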
Honeypots, essentially decoy network-accessible resources, could be deployed in a network as surveillance and early-warning tools. Techniques used by the attackers that attempt to compromise these decoy resources are studied during and after an attack to keep an eye on new exploitation techniques. Such analysis could be used to further tighten security of the actual network being protected by the honeypot.
121. Honeypot (computing)
In computer terminology, a honeypot is a trap set to detect, deflect, or in some manner counteract attempts at unauthorized use of information systems. Generally it consists of a computer, data, or a network site that appears to be part of a network but which is actually isolated, protected, and monitored, and which seems to contain information or a resource that would be of value to attackers.
A honeypot is valuable as a surveillance and early-warning tool. While it is often a computer, a honeypot can take on other forms, such as files or data records, or even unused IP address space. A honeypot that masquerades as an open proxy in order to monitor and record the activities of those using the system is known as a sugarcane. Honeypots should have no production value and hence should not see any legitimate traffic or activity. Whatever they capture can then be assumed to be malicious or unauthorized. One very practical implication of this is that honeypots designed to thwart spam by masquerading as systems of the types abused by spammers to send spam can categorize the material they trap 100% accurately: it is all illicit.
Honeypots can carry risks to a network, and must be handled with care. If they are not properly walled off, an attacker can use them to break into a system.
Victim hosts are an active network counter-intrusion tool. These computers run special software designed to appear to an intruder as being important and worth looking into. In reality, these programs are dummies, and their patterns are constructed specifically to foster interest in attackers. The software installed on, and run by, victim hosts is dual purpose. First, these dummy programs keep a network intruder occupied looking for valuable information where none exists, effectively isolating him or her in what is truly an unimportant part of the network. This decoy strategy is designed to keep an intruder from getting bored and heading into truly security-critical systems. The second part of the victim host strategy is intelligence gathering. Once an intruder has broken into the victim host, the machine or a network administrator can examine the intrusion methods used by the intruder. This intelligence can be used to build specific countermeasures to intrusion techniques, making truly important systems on the network less vulnerable to intrusion.
Production honeypots are easy to use, capture only limited information, and are used primarily by companies or corporations. They are placed inside the production network with other production servers by an organization to improve its overall state of security. Normally, production honeypots are low-interaction honeypots, which are easier to deploy. They give less information about the attacks or attackers than research honeypots do. The purpose of a production honeypot is to help mitigate risk in an organization; the honeypot adds value to the security measures of an organization.
Research honeypots are run by a volunteer, non-profit research organization or an educational institution to gather information about the motives and tactics of the Blackhat community targeting different networks. These honeypots do not add direct value to a specific organization. Instead they are used to research the threats organizations face, and to learn how to better protect against those threats. This information is then used to protect against those threats. Research honeypots are complex to deploy and maintain, capture extensive information, and are used primarily by research, military, or government organizations.
122. E-mail spam
E-mail spam, also known as junk e-mail, is a subset of spam that involves nearly identical messages sent to numerous recipients by e-mail. A common synonym for spam is unsolicited bulk e-mail (UBE). Definitions of spam usually include the aspects that email is unsolicited and sent in bulk. "UCE" refers specifically to unsolicited commercial e-mail.
E-mail spam has steadily, even exponentially, grown since the early 1990s to several billion messages a day. Spam has frustrated, confused, and annoyed e-mail users. Laws against spam have been sporadically implemented, with some jurisdictions taking an opt-out approach and others requiring opt-in e-mail. The total volume of spam (over 100 billion e-mails per day as of April 2008) has leveled off slightly in recent years and is no longer growing exponentially. The amount received by most e-mail users has decreased, mostly because of better filtering. About 80% of all spam is sent by fewer than 200 spammers. Botnets, networks of virus-infected computers, are used to send about 80% of spam. Since the cost of the spam is borne mostly by the recipient, it is effectively postage-due advertising.
E-mail addresses are collected from chatrooms, websites, newsgroups, and viruses which harvest users' address books, and are sold to other spammers. Much spam is sent to invalid e-mail addresses. ISPs have attempted to recover the cost of spam through lawsuits against spammers, although they have been mostly unsuccessful in collecting damages despite winning in court. Spam averages 94% of all e-mail sent.
Blank spam is spam lacking a payload advertisement. Often the message body is missing altogether, as well as the subject line. Still, it fits the definition of spam because of its nature as bulk and unsolicited e-mail. Blank spam may originate in different ways, either intentionally or unintentionally. Blank spam can have been sent in a directory harvest attack, a form of dictionary attack for gathering valid addresses from an e-mail service provider. Since the goal in such an attack is to use the bounces to separate invalid addresses from the valid ones, the spammer may dispense with most elements of the header and the entire message body, and still accomplish his or her goals. Blank spam may also occur when a spammer forgets or otherwise fails to add the payload when he or she sets up the spam run. Often blank spam headers appear truncated, suggesting that computer glitches may have contributed to this problem, from poorly written spam software to shoddy relay servers, or any problems that may truncate header lines from the message body. Some spam may appear to be blank when in fact it is not. An example of this is the VBS.Davinia.B e-mail worm, which propagates through messages that have no subject line and appear blank, when in fact it uses HTML code to download other files.
123. Software testing
Software Testing is an empirical investigation conducted to provide stakeholders with information about the quality of the product or service under test, with respect to the context in which it is intended to operate. Software Testing also provides an objective, independent view of the software to allow the business to appreciate and understand the risks at implementation of the software. Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs. It can also be stated as the process of validating and verifying that a software program/application/product meets the business and technical requirements that guided its design and development, so that it works as expected and can be implemented with the same characteristics.
Software testing, depending on the testing method employed, can be implemented at any time in the development process; however, most of the test effort occurs after the requirements have been defined and the coding process has been completed.
Testing can never completely identify all the defects within software. Instead, it furnishes a criticism or comparison that compares the state and behavior of the product against oracles—principles or mechanisms by which someone might recognize a problem. These oracles may include (but are not limited to) specifications, contracts, comparable products, past versions of the same product, inferences about intended or expected purpose, user or customer expectations, relevant standards, applicable laws, or other criteria.
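The idea of an oracle can be made concrete with a small Python test. Here `reciprocal_sqrt` is a hypothetical function under test, and `math.sqrt` plays the role of a trusted comparable product serving as the oracle:

```python
import math

def reciprocal_sqrt(x):
    # hypothetical implementation under test
    return x ** -0.5

def test_against_oracle():
    # The oracle: an independent mechanism for recognizing a wrong answer.
    for x in [0.25, 1.0, 2.0, 100.0]:
        expected = 1.0 / math.sqrt(x)
        assert math.isclose(reciprocal_sqrt(x), expected, rel_tol=1e-9), x

test_against_oracle()  # raises AssertionError if any case disagrees
```

Note that passing this test does not establish correctness under all conditions; it only fails to find a problem under these specific ones.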
Every software product has a target audience. For example, the audience for video game software is completely different from that for banking software. Therefore, when an organization develops or otherwise invests in a software product, it can assess whether the software product will be acceptable to its end users, its target audience, its purchasers, and other stakeholders. Software testing is the process of attempting to make this assessment.
A study conducted by NIST in 2002 reported that software bugs cost the U.S. economy $59.5 billion annually. More than a third of this cost could be avoided if better software testing were performed.
A primary purpose for testing is to detect software failures so that defects may be uncovered and corrected. This is a non-trivial pursuit. Testing cannot establish that a product functions properly under all conditions but can only establish that it does not function properly under specific conditions. The scope of software testing often includes examination of code as well as execution of that code in various environments and conditions as well as examining the aspects of code: does it do what it is supposed to do and do what it needs to do. In the current culture of software development, a testing organization may be separate from the development team. There are various roles for testing team members. Information derived from software testing may be used to correct the process by which software is developed.
124. Compiler
A compiler is a computer program (or set of programs) that transforms source code written in a computer language (the source language) into another computer language (the target language, often having a binary form known as object code). The most common reason for wanting to transform source code is to create an executable program.
The name "compiler" is primarily used for programs that translate source code from a high-level programming language to a lower level language (e.g., assembly language or machine code). A program that translates from a low level language to a higher level one is a decompiler. A program that translates between high-level languages is usually called a language translator, source to source translator, or language converter. A language rewriter is usually a program that translates the form of expressions without a change of language. A compiler is likely to perform many or all of the following operations: lexical analysis, preprocessing, parsing, semantic analysis, code generation, and code optimization. Program faults caused by incorrect compiler behavior can be very difficult to track down and work around and compiler implementers invest a lot of time ensuring the correctness of their software. The term compiler-compiler is sometimes used to refer to a parser generator, a tool often used to help create a compiler.
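Several of the phases listed above can be seen in a toy compiler for arithmetic expressions, sketched here in Python. The stack-machine target and its instruction names are invented purely for illustration:

```python
import operator
import re

def lex(src):
    # Lexical analysis: split the source string into tokens.
    return re.findall(r'\d+|[-+*/()]', src)

def parse_to_rpn(tokens):
    # Parsing: the shunting-yard algorithm yields postfix (RPN) order.
    prec = {'+': 1, '-': 1, '*': 2, '/': 2}
    out, ops = [], []
    for t in tokens:
        if t.isdigit():
            out.append(t)
        elif t == '(':
            ops.append(t)
        elif t == ')':
            while ops[-1] != '(':
                out.append(ops.pop())
            ops.pop()  # discard the '('
        else:
            while ops and ops[-1] != '(' and prec[ops[-1]] >= prec[t]:
                out.append(ops.pop())
            ops.append(t)
    while ops:
        out.append(ops.pop())
    return out

def codegen(rpn):
    # Code generation: emit instructions for the invented stack machine.
    return [('PUSH', int(t)) if t.isdigit() else ('OP', t) for t in rpn]

def run(program):
    # The "target machine": execute the generated instructions.
    fns = {'+': operator.add, '-': operator.sub,
           '*': operator.mul, '/': operator.truediv}
    stack = []
    for kind, arg in program:
        if kind == 'PUSH':
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(fns[arg](a, b))
    return stack[0]
```

`run(codegen(parse_to_rpn(lex('2+3*4'))))` evaluates to 14, with operator precedence handled during parsing; a real compiler would add semantic analysis and optimization passes between parsing and code generation.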
Higher-level programming languages are generally divided for convenience into compiled languages and interpreted languages. However, in practice there is rarely anything about a language that requires it to be exclusively compiled or exclusively interpreted, although it is possible to design languages that are inherently interpretive. The categorization usually reflects the most popular or widespread implementations of a language — for instance, BASIC is sometimes called an interpreted language and C a compiled one, despite the existence of BASIC compilers and C interpreters.
The output of some compilers may target hardware at a very low level, for example a Field Programmable Gate Array (FPGA) or structured Application-specific integrated circuit (ASIC). Such compilers are said to be hardware compilers or synthesis tools because the programs they compile effectively control the final configuration of the hardware and how it operates; the output of the compilation are not instructions that are executed in sequence - only an interconnection of transistors or lookup tables. For example, XST is the Xilinx Synthesis Tool used for configuring FPGAs. Similar tools are available from Altera, Synplicity, Synopsys and other vendors.
125. Debugger
A debugger is a computer program that is used to test and debug other programs. The code to be examined might alternatively be running on an instruction set simulator (ISS), a technique that allows great power in its ability to halt when specific conditions are encountered but which will typically be much slower than executing the code directly on the appropriate processor.
When the program crashes, the debugger shows the position in the original code if it is a source-level debugger or symbolic debugger, commonly seen in integrated development environments. If it is a low-level debugger or a machine-language debugger it shows the line in the disassembly. (A "crash" happens when the program cannot continue because of a programming bug. For example, perhaps the program tried to use an instruction not available on the current version of the CPU or attempted access to unavailable or protected memory.)
Typically, debuggers also offer more sophisticated functions such as running a program step by step (single-stepping), stopping (breaking) (pausing the program to examine the current state) at some kind of event by means of breakpoint, and tracking the values of some variables. Some debuggers have the ability to modify the state of the program while it is running, rather than merely to observe it.
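CPython exposes the hook that such debuggers build on through `sys.settrace`. The sketch below records a single watched local variable at every line event, a crude form of the single-stepping and variable tracking described above (the function names are invented for the example):

```python
import sys

def make_tracer(watch):
    """Build a trace function of the kind debuggers install: CPython
    calls it before each line of traced code runs."""
    log = []
    def tracer(frame, event, arg):
        if event == 'line' and watch in frame.f_locals:
            # record (line number, current value of the watched variable)
            log.append((frame.f_lineno, frame.f_locals[watch]))
        return tracer  # keep tracing inside newly entered frames
    return tracer, log

def sum_upto(n):
    total = 0
    for i in range(n):
        total += i
    return total

tracer, log = make_tracer('total')
sys.settrace(tracer)   # install the trace hook, as a debugger would
sum_upto(3)
sys.settrace(None)     # uninstall it
# log now holds (line, total) snapshots, one per executed line
```

This also illustrates why programs behave differently under a debugger: every traced line pays the cost of a Python function call, which changes the program's timing.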
The importance of a good debugger cannot be overstated. Indeed, the existence and quality of such a tool for a given language and platform can often be the deciding factor in its use, even if another language/platform is better-suited to the task. However, it is also important to note that software can (and often does) behave differently running under a debugger than normally, due to the inevitable changes the presence of a debugger will make to a software program's internal timing. As a result, even with a good debugging tool, it is often very difficult to track down runtime problems in complex multi-threaded or distributed systems.
The same functionality which makes a debugger useful for eliminating bugs allows it to be used as a software cracking tool to evade copy protection, digital rights management, and other software protection features.
126. Emulator
An emulator duplicates (provides an emulation of) the functions of one system using a different system, so that the second system behaves like (and appears to be) the first system. This focus on exact reproduction of external behavior is in contrast to some other forms of computer simulation, which can concern an abstract model of the system being simulated.
Emulation refers to the ability of a computer program or electronic device to imitate another program or device. Many printers, for example, are designed to emulate Hewlett-Packard LaserJet printers because so much software is written for HP printers. By emulating an HP printer, a printer can work with any software written for a real HP printer. Emulation "tricks" the running software into believing that a device is really some other device.
A hardware emulator is an emulator which takes the form of a hardware device. Examples include the DOS-compatible card installed in some old-world Macintoshes like Centris 610 or Performa 630 that allowed them to run PC programs and FPGA-based hardware emulators.
In a theoretical sense, the Church-Turing thesis implies that any operating environment can be emulated within any other. However, in practice, it can be quite difficult, particularly when the exact behavior of the system to be emulated is not documented and has to be deduced through reverse engineering. It also says nothing about timing constraints; if the emulator does not perform as quickly as the original hardware, the emulated software may run much more slowly than it would have on the original hardware, possibly triggering time interrupts to alter performance.
Most emulators just emulate hardware architecture — if operating system firmware or software is required for the desired software, it must be provided as well (and may itself be emulated). Both the OS and the software will then be interpreted by the emulator, rather than being run by native hardware. Apart from this interpreter for the emulated binary machine's language, some other hardware (such as input or output devices) must be provided in virtual form as well; for example, if writing to a specific memory location should influence what is displayed on the screen, then this would need to be emulated.
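The interpreter loop at the heart of such an emulator can be sketched in a few lines of Python. The three-instruction machine below is entirely made up for illustration; real emulators decode binary opcodes and model registers, flags, and I/O devices:

```python
def emulate(program, memory):
    """Interpreter loop for an invented three-instruction machine."""
    acc = 0   # accumulator register of the emulated CPU
    pc = 0    # program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == 'LOAD':      # acc <- memory[arg]
            acc = memory[arg]
        elif op == 'ADD':     # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == 'STORE':   # memory[arg] <- acc
            memory[arg] = acc
        pc += 1
    return memory

mem = {0: 2, 1: 3, 2: 0}
emulate([('LOAD', 0), ('ADD', 1), ('STORE', 2)], mem)
# the emulated machine has computed mem[2] = 2 + 3 = 5
```

Each pass through the loop interprets one emulated instruction, which is why emulated code typically runs much more slowly than the same code on native hardware.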
127. Software developer
A software developer is a person or organization concerned with facets of the software development process wider than design and coding: a somewhat broader scope than computer programming, or a specialty of project management that includes some aspects of software product management. This person may contribute to the overview of the project at the application level rather than at the level of components or individual programming tasks. Software developers are often still guided by lead programmers, but the term also encompasses freelance software developers.
Other names which are often used in the same close context are software analyst and software engineer.
Over time, the differences between system design, software development and programming have become more apparent. In the current marketplace there is a segregation between programmers and developers: the one who actually implements is not the same as the one who designs the class structure or hierarchy. Even more so, developers may become systems architects, those who design the multi-leveled architecture or component interactions of a large software system. (See also: debate over who is a software engineer.)
A 'programmer' is responsible for writing code, but a 'developer' could be involved in wider aspects of the software development process such as:
Participation in software product definition, including Business case or Gap analysis
Specification
Requirements analysis
Development and refinement of throw-away simulations or prototypes to confirm requirements
Feasibility and Cost-benefit analysis, including the choice of application architecture and framework, leading to the budget and schedule for the project
Design
Implementation (e.g. installation, configuration, programming/customization, integration, data migration)
Authoring of documentation needed by users and implementation partners etc.
Testing, including defining/supporting acceptance testing and gathering feedback from pre-release testers
Participation in software release and post-release activities, including support for product launch evangelism (e.g. developing demonstrations and/or samples) and competitive analysis for subsequent product build/release cycles
Maintenance
In a large company there may be employees whose sole responsibility consists of only one of the phases above. In smaller development environments a few people, or even a single individual, might handle the complete process. In a small company, the typical involvement of software developers includes every step from initial specification of a project to the completed system. Typically it includes:
1. Initial meeting - where requirements are discussed in detail.
2. Proposal - a proposal based on the initial conversation, with recommendations on the best approach.
3. Detailed design - for most projects, there is usually more design work to clarify exactly how the system should work.
4. Update financials and agree contract - if the requirements have changed during the detailed design process, this is the stage to update the project costs.
5. Development - software developers start work on the system.
6. Functionally complete - at the end of development, a system is delivered which is “functionally complete” but may need further testing to iron out any bugs.
7. System completed - testing is complete, and the system is ready for use.
128. CPU cache
A CPU cache is a cache used by the central processing unit of a computer to reduce the average time to access memory. The cache is a smaller, faster memory which stores copies of the data from the most frequently used main memory locations. As long as most memory accesses are cached memory locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory.
When the processor needs to read from or write to a location in main memory, it first checks whether a copy of that data is in the cache. If so, the processor immediately reads from or writes to the cache, which is much faster than reading from or writing to main memory.
Consider two memories: a cache and a main memory. Each location in each memory has a datum (a cache line), which in different designs ranges in size from 8 to 512 bytes. The size of the cache line is usually larger than the size of the usual access requested by a CPU instruction, which ranges from 1 to 16 bytes. Each location in each memory also has an index, which is a unique number used to refer to that location. The index for a location in main memory is called an address. Each location in the cache has a tag that contains the index of the datum in main memory that has been cached. In a CPU's data cache these entries are called cache lines or cache blocks.
Most modern desktop and server CPUs have at least three independent caches: an instruction cache to speed up executable instruction fetch, a data cache to speed up data fetch and store, and a translation lookaside buffer (TLB) used to speed up virtual-to-physical address translation for both executable instructions and data.
When the processor needs to read or write a location in main memory, it first checks whether that memory location is in the cache. This is accomplished by comparing the address of the memory location to all tags in the cache that might contain that address. If the processor finds that the memory location is in the cache, we say that a cache hit has occurred; otherwise we speak of a cache miss. In the case of a cache hit, the processor immediately reads or writes the data in the cache line. The proportion of accesses that result in a cache hit is known as the hit rate, and is a measure of the effectiveness of the cache.
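The tag, index, hit, and miss mechanics can be sketched with a toy direct-mapped cache in Python. This is a simplified model for illustration; real caches are hardware, and most real designs are set-associative rather than direct-mapped:

```python
class DirectMappedCache:
    """Toy direct-mapped cache: each memory block maps to exactly one line."""
    def __init__(self, num_lines, line_size):
        self.num_lines = num_lines
        self.line_size = line_size
        self.tags = [None] * num_lines   # tag of the cached block per line
        self.hits = self.misses = 0

    def access(self, address):
        block = address // self.line_size    # which memory block holds the byte
        index = block % self.num_lines       # which cache line the block maps to
        tag = block // self.num_lines        # identifies the block within that line
        if self.tags[index] == tag:
            self.hits += 1
            return True                      # cache hit
        self.tags[index] = tag               # cache miss: fill the line
        self.misses += 1
        return False

cache = DirectMappedCache(num_lines=4, line_size=16)
for addr in [0, 4, 8, 64, 0]:
    cache.access(addr)
```

For this access pattern the hit rate is 2/5: addresses 0, 4 and 8 fall in one 16-byte line (one miss, then two hits), while address 64 maps to the same line index with a different tag, evicting it, so the final access to 0 misses again.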
129. Jewelry design
Jewelry design is the art or profession of creating, crafting, fabricating, or rendering designs for jewelry. This is an ancient practice of the goldsmith or metalworker that evolved to a billion-dollar industry with the odyssey from ancient cultures into the machine age. Jewelry design falls under the category of what is commonly known as "functional art", being art that can be worn or used.
Before an article of jewelry is created, it is typically rendered by a jewelry designer, a professional who is trained in the architectural and functional knowledge of not only metallurgy but also design elements such as composition and wearability.
Once the article is rendered, the design is then constructed using the necessary materials for proper adaptation to the function of the object. For example, 24K gold was used in ancient jewelry design because it was more accessible than silver as a source material. Before the 1st century, many civilizations also incorporated beads into jewelry. Once gemstones and gem cutting became more readily available, the art of jewelry ornamentation and design shifted. The earliest documented gemstone cut was done by Theophilus Presbyter (c. 1070-1125), who practiced and developed many applied arts and was a known goldsmith. Later, during the 14th century, medieval lapidary technology evolved to include cabochons and cameos.
Early jewelry design commissions often came from nobility or the church to honor an event or as wearable ornamentation. Among early methods, enameling and repoussé became standard techniques for creating ornamental wares to demonstrate wealth, position, or power. These early techniques created a specific complex design element that would later forge the baroque movement in jewelry design.
130. Embedded system
An embedded system is a special-purpose computer system designed to perform one or a few dedicated functions, often with real-time computing constraints. It is usually embedded as part of a complete device including hardware and mechanical parts. In contrast, a general-purpose computer, such as a personal computer, can do many different tasks depending on programming. Embedded systems control many of the common devices in use today.
Since the embedded system is dedicated to specific tasks, design engineers can optimize it, reducing the size and cost of the product, or increasing the reliability and performance. Some embedded systems are mass-produced, benefiting from economies of scale.
Physically, embedded systems range from portable devices such as digital watches and MP4 players, to large stationary installations like traffic lights, factory controllers, or the systems controlling nuclear power plants. Complexity varies from low, with a single microcontroller chip, to very high with multiple units, peripherals and networks mounted inside a large chassis or enclosure.
In general, "embedded system" is not an exactly defined term, as many systems have some element of programmability. For example, handheld computers share some elements with embedded systems — such as the operating systems and microprocessors which power them — but are not truly embedded systems, because they allow different applications to be loaded and peripherals to be connected.
Consumer electronics include personal digital assistants (PDAs), MP3 players, mobile phones, videogame consoles, digital cameras, DVD players, GPS receivers, and printers. Many household appliances, such as microwave ovens, washing machines and dishwashers, include embedded systems to provide flexibility, efficiency and features. Advanced HVAC systems use networked thermostats to control temperature more accurately and efficiently, varying by time of day and season. Home automation uses wired and wireless networking to control lights, climate, security, audio/visual systems, surveillance, etc., all of which use embedded devices for sensing and control.
131. Digital signal processing
Digital signal processing (DSP) is concerned with the representation of signals by a sequence of numbers or symbols and the processing of these signals. Digital signal processing and analog signal processing are subfields of signal processing. DSP includes subfields such as audio and speech signal processing, sonar and radar signal processing, sensor array processing, spectral estimation, statistical signal processing, digital image processing, signal processing for communications, biomedical signal processing, and seismic data processing.
Since the goal of DSP is usually to measure or filter continuous real-world analog signals, the first step is usually to convert the signal from analog to digital form, using an analog-to-digital converter. Often, the required output is another analog signal, which requires a digital-to-analog converter. Although this process is more complex than analog processing and restricts the signal to a discrete range of values, the stability of digital signal processing, its support for error detection and correction, and its lower vulnerability to noise make it advantageous over analog signal processing for many, though not all, applications.
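The analog-to-digital step above can be sketched as a uniform quantizer. This toy `quantize`/`dequantize` pair is an idealized illustration (the function names and the 8-bit, ±1 range are assumptions for the example), not any real converter's behavior:

```python
def quantize(value, bits, vmin=-1.0, vmax=1.0):
    """Map an 'analog' value onto one of 2**bits discrete codes, as an
    idealized analog-to-digital converter would."""
    levels = 2 ** bits - 1
    clamped = min(max(value, vmin), vmax)
    return round((clamped - vmin) / (vmax - vmin) * levels)

def dequantize(code, bits, vmin=-1.0, vmax=1.0):
    """Idealized digital-to-analog conversion back to an 'analog' value."""
    levels = 2 ** bits - 1
    return vmin + code / levels * (vmax - vmin)

code = quantize(0.3, 8)        # 8-bit conversion of a 0.3 sample
approx = dequantize(code, 8)   # within one quantization step of 0.3
```

Round-tripping a sample through the pair shows the quantization error the passage alludes to: the recovered value differs from the original by at most one step of the discrete range.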
DSP algorithms have long been run on standard computers, on specialized processors called digital signal processors (DSPs), or on purpose-built hardware such as application-specific integrated circuits (ASICs). Today there are additional technologies used for digital signal processing, including more powerful general-purpose microprocessors, field-programmable gate arrays (FPGAs), digital signal controllers (mostly for industrial applications such as motor control), and stream processors, among others.
In DSP, engineers usually study digital signals in one of the following domains: time domain (one-dimensional signals), spatial domain (multidimensional signals), frequency domain, autocorrelation domain, and wavelet domains. They choose the domain in which to process a signal by making an informed guess (or by trying different possibilities) as to which domain best represents the essential characteristics of the signal. A sequence of samples from a measuring device produces a time or spatial domain representation, whereas a discrete Fourier transform produces the frequency domain information that is the frequency spectrum. Autocorrelation is defined as the cross-correlation of the signal with itself over varying intervals of time or space.
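The time-domain and frequency-domain representations mentioned above can be illustrated with a naive discrete Fourier transform. This is a textbook O(n²) sketch (a real system would use an FFT), applied to a sampled sine wave:

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform: time-domain samples -> frequency bins."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

# Time-domain representation: a sine wave completing 2 cycles over 16 samples.
n = 16
signal = [math.sin(2 * math.pi * 2 * t / n) for t in range(n)]

# Frequency-domain representation: the energy concentrates at bin 2
# (and its mirror image at bin n - 2).
spectrum = dft(signal)
peak_bin = max(range(n // 2), key=lambda k: abs(spectrum[k]))
```

The peak lands at bin 2 because the signal completes exactly two cycles over the sampled window, which is precisely the informed choice of domain the paragraph describes: a pure tone that is spread across every time-domain sample collapses to a single frequency-domain bin.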
132. Signals intelligence
Signals intelligence (often contracted to SIGINT) is intelligence-gathering by interception of signals, whether between people (i.e., COMINT or communications intelligence) or between machines (i.e., ELINT or electronic intelligence), or mixtures of the two. As sensitive information is often encrypted, signals intelligence often involves the use of cryptanalysis. However, traffic analysis—the study of who is signaling whom and in what quantity—can often produce valuable information, even when the messages themselves cannot be decrypted. See SIGINT by Alliances, Nations and Industries for the organization of SIGINT activities, and Signals intelligence operational platforms by nation for current collection systems, and SIGINT in Modern History from World War I to the present.
As a means of collecting intelligence, signals intelligence is a subset of intelligence collection management, which, in turn, is a subset of intelligence cycle management. Intercepting written but encrypted communications, and extracting information, probably did not wait long after the development of writing. A simple encryption system, for example, is the Caesar cipher. Electronic interception appeared as early as 1900, during the Boer War. The Boers had captured some British radios, and, since the British were the only people transmitting at the time, their signals were obvious and easy to intercept.
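The Caesar cipher mentioned above is simple enough to sketch in a few lines; this illustrative implementation shifts letters a fixed distance within the alphabet and passes everything else through unchanged:

```python
def caesar(text, shift):
    """Caesar cipher: shift each letter by a fixed amount, wrapping within
    the alphabet; non-letters pass through unchanged."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

ciphertext = caesar("ATTACK AT DAWN", 3)   # "DWWDFN DW GDZQ"
plaintext = caesar(ciphertext, -3)          # a negative shift decrypts
```

Its weakness also illustrates why cryptanalysis works: with only 25 possible shifts, an interceptor can simply try them all.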
First, atmospheric conditions, sunspots, the target's transmission schedule and antenna characteristics, and other factors create uncertainty as to whether a given signal intercept sensor will be able to "hear" the signal of interest, even with a geographically fixed target and an opponent making no attempt to evade interception. Basic countermeasures against interception include frequent changes of radio frequency, polarization, and other transmission characteristics. An intercept aircraft could not get off the ground if it had to carry antennas and receivers for every possible frequency and signal type needed to deal with such countermeasures.
Second, locating the transmitter's position is usually part of SIGINT. Triangulation and more sophisticated radio location techniques, such as time of arrival methods, require multiple receiving points at different locations. These receivers send location-relevant information to a central point, or perhaps to a distributed system in which all participate, such that the information can be correlated and a location computed.
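The triangulation idea above can be sketched in two dimensions: two receivers at known positions each measure a bearing to the transmitter, and the intersection of the two bearing lines gives the estimated location. This is a minimal geometric illustration (bearings measured from the +x axis on a flat plane, non-parallel bearings assumed), not a real radio-location system:

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate a transmitter from two bearings (radians, measured from the
    +x axis) taken at two known receiver positions, by intersecting the
    two bearing lines. Assumes the bearings are not parallel."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via Cramer's rule.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Transmitter actually at (3, 4): receiver A at the origin measures a
# bearing of atan2(4, 3); receiver B at (10, 0) measures atan2(4, -7).
x, y = triangulate((0.0, 0.0), math.atan2(4, 3), (10.0, 0.0), math.atan2(4, -7))
```

The need for multiple receiving points falls straight out of the geometry: a single bearing defines only a line, and the position appears only where two or more lines cross.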
133. Robotics
Robotics is the science and technology of robots: their design, manufacture, and application. Stories of artificial helpers and companions, and likewise attempts to create them, have a long history, but fully autonomous machines appeared only in the 20th century. The first digitally operated and programmable robot, the Unimate, was installed in 1961 to lift hot pieces of metal from a die casting machine and stack them. Today, commercial and industrial robots are in widespread use, performing jobs more cheaply or with greater accuracy and reliability than humans. They are also employed for jobs which are too dirty, dangerous, or dull to be suitable for humans. Robots are widely used in manufacturing, assembly and packing, transport, earth and space exploration, surgery, weaponry, laboratory research, safety, and the mass production of consumer and industrial goods.
The structure of a robot is usually mostly mechanical and can be called a kinematic chain (its functionality being similar to the skeleton of the human body). The chain is formed of links (its bones), actuators (its muscles), and joints, which can allow one or more degrees of freedom. Most contemporary robots use open serial chains, in which each link connects the one before to the one after it. These robots are called serial robots and often resemble the human arm. Some robots, such as the Stewart platform, use a closed parallel kinematic chain. Other structures, such as those that mimic the mechanical structure of humans, various animals, and insects, are comparatively rare. However, the development and use of such structures in robots is an active area of research (e.g. biomechanics). Robots used as manipulators have an end effector mounted on the last link. This end effector can be anything from a welding device to a mechanical hand used to manipulate the environment.
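The kinematic-chain description above can be illustrated with the forward kinematics of a planar open serial chain, where each joint angle is measured relative to the previous link; a minimal sketch, not any particular robot's model:

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """End-effector position of a planar open serial chain: each joint
    angle (radians) is measured relative to the previous link."""
    x = y = 0.0
    total = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        total += angle                 # angles accumulate along the chain
        x += length * math.cos(total)
        y += length * math.sin(total)
    return x, y

# Two unit-length links: fully extended along +x the end effector is at (2, 0);
# bending the 'elbow' 90 degrees moves it to (1, 1).
extended = forward_kinematics([1.0, 1.0], [0.0, 0.0])
bent = forward_kinematics([1.0, 1.0], [0.0, math.pi / 2])
```

The accumulation of angles along the chain is exactly the "each link connects the one before to the one after it" property: moving an early joint sweeps every link after it.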
Current robotic and prosthetic hands receive far less tactile information than the human hand. Recent research has developed a tactile sensor array that mimics the mechanical properties and touch receptors of human fingertips. The sensor array is constructed as a rigid core surrounded by conductive fluid contained by an elastomeric skin. Electrodes are mounted on the surface of the rigid core and are connected to an impedance-measuring device within the core. When the artificial skin touches an object the fluid path around the electrodes is deformed, producing impedance changes that map the forces received from the object. The researchers expect that an important function of such artificial fingertips will be adjusting robotic grip on held objects.
Robots also require navigation hardware and software in order to react to their environment. In particular, unforeseen events (e.g. people and other obstacles that are not stationary) can cause problems or collisions. Some highly advanced robots, such as ASIMO, EveR-1, and the Mien robot, have particularly good navigation hardware and software. Also, self-controlled cars, Ernst Dickmanns' driverless car, and the entries in the DARPA Grand Challenge are capable of sensing the environment well and making navigation decisions based on this information. Many such robots include a GPS navigation device with waypoints, along with radar, sometimes combined with other sensor data such as LIDAR, video cameras, and inertial guidance systems for better navigation between waypoints.
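Waypoint navigation of the kind described can be sketched as: keep a list of waypoints, drop each one once the robot is close enough, and steer toward the next. A toy illustration (the reach radius and function names are assumptions for the example), not a real planner:

```python
import math

def next_heading(position, waypoints, reach_radius=1.0):
    """Return (bearing_to_next_waypoint, remaining_waypoints): waypoints
    within reach_radius are considered reached and dropped. A toy planner,
    not any real robot's navigation stack."""
    while waypoints:
        wx, wy = waypoints[0]
        dx, dy = wx - position[0], wy - position[1]
        if math.hypot(dx, dy) > reach_radius:
            return math.atan2(dy, dx), waypoints
        waypoints = waypoints[1:]   # current waypoint reached; drop it
    return None, []                 # route complete

# The first waypoint is already within reach, so the robot steers toward
# the second one, due east of it (bearing 0).
heading, remaining = next_heading((0.0, 0.0), [(0.5, 0.0), (10.0, 0.0)])
```

A real system would fuse GPS fixes with the inertial and LIDAR data mentioned above before feeding a position into logic like this; the sketch shows only the waypoint bookkeeping.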
134. Robotic spacecraft
A robotic spacecraft is a spacecraft with no humans on board, that is usually under telerobotic control. A robotic spacecraft designed to make scientific research measurements is often called a space probe. Many space missions are more suited to telerobotic rather than crewed operation, due to lower cost and lower risk factors. In addition, some planetary destinations such as Venus or the vicinity of Jupiter are too hostile for human survival, given current technology. Outer planets such as Saturn, Uranus, and Neptune are too distant to reach with current crewed spaceflight technology, so telerobotic probes are the only way to explore them.
The first space mission, Sputnik 1, was an artificial satellite put into Earth orbit by the Soviet Union on 4 October 1957. On 3 November 1957, the Soviets orbited Sputnik 2, the first spacecraft to carry a living animal into space – a dog.
The United States achieved its first successful space probe launch with the orbit of Explorer 1 on 31 January 1958. Explorer 1 weighed less than 14 kilograms, compared to 83.6 kg and 508.3 kg for Sputniks 1 and 2 respectively. Nonetheless, Explorer 1 detected a narrow band of radiation surrounding the Earth, named the Van Allen belts after the scientist whose equipment detected them. Only six other countries have successfully launched missions using their own vehicles: France (1965), Japan (1970), China (1970), the United Kingdom (1971), India (1981) and Israel (1988).
Most American space probe missions have been coordinated by the Jet Propulsion Laboratory, and European missions by the European Space Operations Centre, part of the European Space Agency (ESA). ESA has conducted relatively fewer space exploration missions in the past (one example is the Giotto mission, which encountered comet Halley), but has launched several interplanetary spacecraft in recent years (e.g. the Rosetta space probe, Mars Express, Venus Express). ESA has, however, launched many spacecraft to carry out astronomy, and is a collaborator with NASA on the Hubble Space Telescope. There have been many successful Russian space missions. There have also been a few Japanese, Chinese and Indian missions.
The supplies of electric power on spacecraft come from photovoltaic (solar) cells or from a radioisotope thermoelectric generator. Other components of the subsystem include batteries for storing power and distribution circuitry that connects components to the power sources. Spacecraft are often protected from temperature fluctuations with insulation. Some spacecraft use mirrors and sunshades for additional protection from solar heating. They also often need shielding from micrometeoroids and orbital debris.
135. Bing (search engine)
Bing (formerly Live Search, Windows Live Search and MSN Search) is a web search engine (advertised as a "decision" engine [1]), Microsoft's current incarnation of its search technology. Unveiled by Microsoft CEO Steve Ballmer on May 28, 2009 at the All Things Digital conference in San Diego, Bing is a replacement for Live Search. It went fully online on June 3, 2009.[2]
Notable changes include the listing of search suggestions in real time as queries are entered, and a list of related searches (called the "Explorer pane", on the left side of the search results), based on semantic technology from Powerset, which Microsoft purchased in 2008. Bing also includes the ability to save and share search histories via Windows Live SkyDrive, Facebook, and e-mail. Most of the new features in Bing are currently only available in the United States version. Users can change their country settings in the toolbar at the top right-hand corner of the Bing site.
MSN Search was a search engine by Microsoft that comprised a search engine, index, and web crawler. MSN Search first launched in the fall of 1998 and used search results from Inktomi. In early 1999, MSN Search launched a version which displayed listings from Looksmart blended with results from Inktomi except for a short time in 1999 when results from AltaVista were used instead. Since then Microsoft upgraded MSN Search to provide its own Microsoft-built search engine results (list of web addresses with samples of content that meet a user's query), the index of which is updated weekly or even daily. The upgrade started as a beta program in November 2004 (based on several years of research), and came out of beta in February 2005. Image search was powered by a third party, Picsearch. The service also started providing its search results to other search engine portals in an effort to better compete in the market.
The first public beta of Windows Live Search was unveiled on March 8, 2006, with the final release on September 11, 2006 replacing MSN Search. The new search engine offered users the ability to search for specific types of information using search tabs that include Web, news, images, music, desktop, local, and Microsoft Encarta. Windows Live Search aimed to make its over 2.5 billion worldwide queries each month "more useful by providing consumers with improved access to information and more precise answers to their questions." A configuration menu is available to change the default search engine in Internet Explorer.
136. Mainframe computer
Mainframes (often colloquially referred to as Big Iron) are computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, ERP, and financial transaction processing. The term probably originated with the early mainframes, as they were housed in enormous, room-sized metal boxes or frames. Later the term was used to distinguish high-end commercial machines from less powerful units.
Mainframes can add or hot swap system capacity non-disruptively and granularly, again to a level of sophistication not found on most servers. Modern mainframes, notably the IBM zSeries, System z9 and System z10 servers, offer three levels of virtualization: logical partitions (LPARs, via the PR/SM facility), virtual machines (via the z/VM operating system), and through its operating systems (notably z/OS with its key-protected address spaces and sophisticated goal-oriented workload scheduling, but also Linux, OpenSolaris and Java). This virtualization is so thorough, so well established, and so reliable that most IBM mainframe customers run no more than two machines: one in their primary data center, and one in their backup data center—fully active, partially active, or on standby—in case there is a catastrophe affecting the first building. All test, development, training, and production workload for all applications and all databases can run on a single machine, except for extremely large demands where the capacity of one machine might be limiting. Such a two-mainframe installation can support continuous business service, avoiding both planned and unplanned outages.
Mainframes are designed to handle very high volume input and output (I/O) and emphasize throughput computing. Since the mid-1960s, mainframe designs have included several subsidiary computers (called channels or peripheral processors) which manage the I/O devices, leaving the CPU free to deal only with high-speed memory. It is common in mainframe shops to deal with massive databases and files. Giga-record or tera-record files are not unusual. Compared to a typical PC, mainframes commonly have hundreds to thousands of times as much data storage online, and can access it much faster.[citation needed] While some other server families also offload certain I/O processing and emphasize throughput computing, they do not do so to the same degree and levels of sophistication.
Mainframe return on investment (ROI), like any other computing platform, is dependent on its ability to scale, support mixed workloads, reduce labor costs, deliver uninterrupted service for critical business applications, and several other risk-adjusted cost factors. Some argue that the modern mainframe is not cost-effective. Hewlett-Packard and Dell unsurprisingly take that view at least at times, and so do some independent analysts. Sun Microsystems also takes that view, but beginning in 2007 promoted a partnership with IBM which largely focused on IBM support for Solaris on its System x and BladeCenter products (and therefore unrelated to mainframes), but also included positive comments for the company's OpenSolaris operating system being ported to IBM mainframes as part of increasing the Solaris community. Some analysts (such as Gartner[citation needed]) claim that the modern mainframe often has unique value and superior cost-effectiveness, especially for large scale enterprise computing. In fact, Hewlett-Packard also continues to manufacture its own mainframe (arguably), the NonStop system originally created by Tandem. Logical partitioning is now found in many UNIX-based servers, and many vendors are promoting virtualization technologies, in many ways validating the mainframe's design accomplishments while blurring the differences between the different approaches to enterprise computing.
Mainframes also have execution integrity characteristics for fault tolerant computing. For example, z900, z990, System z9, and System z10 servers effectively execute result-oriented instructions twice, compare results, arbitrate between any differences (through instruction retry and failure isolation), then shift workloads "in flight" to functioning processors, including spares, without any impact to operating systems, applications, or users. This hardware-level feature, also found in HP's NonStop systems, is known as lock-stepping, because both processors take their "steps" (i.e. instructions) together. Not all applications absolutely need the assured integrity that these systems provide, but many do, such as financial transaction processing.
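The lock-stepping idea can be modeled in a few lines: execute the same operation on two "processors", compare the results, and retry on a mismatch. This is purely an illustration of the retry logic, not a model of real mainframe hardware:

```python
def lockstep_execute(op, args, retries=3, run_a=None):
    """Toy model of lock-stepped execution with instruction retry: run the
    same operation on two 'processors', compare results, and retry on a
    mismatch. Purely an illustration, not real mainframe hardware."""
    run_a = run_a or (lambda: op(*args))
    for _ in range(retries):
        result_a = run_a()      # 'processor A' (may suffer a transient fault)
        result_b = op(*args)    # 'processor B'
        if result_a == result_b:
            return result_a     # results agree: commit and continue
    raise RuntimeError("persistent divergence: isolate the failing processor")

# Inject a transient fault that corrupts processor A's first attempt only.
fault = [True]
def glitchy_a():
    if fault and fault.pop():
        return 2 + 3 + 1        # corrupted result from the first attempt
    return 2 + 3

result = lockstep_execute(lambda u, v: u + v, (2, 3), run_a=glitchy_a)
```

The transient fault is caught by the comparison, the retry succeeds, and the correct result is committed, which is the "instruction retry and failure isolation" behavior described above in miniature.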
137. Environmental science
Environmental science is an expression encompassing the wide range of scientific disciplines that must be brought together to understand and manage the natural environment and the many interactions among its physical, chemical, and biological components. It provides an integrated, quantitative, and interdisciplinary approach to the study of environmental systems. Individuals may operate as environmental scientists, or a group of scientists may work together, pooling their individual skills. The most common model for the delivery of environmental science is through the work of an individual scientist or small team drawing on the peer-reviewed, published work of many other scientists throughout the world.
Atmospheric sciences examine the phenomenology of the Earth's gaseous outer layer, with emphasis upon its interrelation to other systems. Atmospheric sciences comprise meteorological studies, greenhouse gas phenomena, atmospheric dispersion modeling of airborne contaminants, sound propagation phenomena related to noise pollution, and even light pollution. Taking the example of the global warming phenomenon, physicists create computer models of atmospheric circulation and infra-red radiation transmission, chemists examine the inventory of atmospheric chemicals and their reactions, biologists analyze the plant and animal contributions to carbon dioxide fluxes, and specialists such as meteorologists and oceanographers add additional breadth in understanding the atmospheric dynamics.
Ecology studies typically analyze the dynamics of biological populations and some aspect of their environment. These studies might address endangered species, predator/prey interactions, habitat integrity, effects upon populations of environmental contaminants, or impact analysis of proposed land development upon species viability. An interdisciplinary analysis of an ecological system which is being impacted by one or more stressors might include several related environmental science fields. For example, one might examine an estuarine setting where a proposed industrial development could impact certain species through water pollution and air pollution. For this study, biologists would describe the flora and fauna, chemists would analyze the transport of water pollutants to the marsh, physicists would calculate air pollution emissions, and geologists would assist in understanding the marsh soils and bay muds.
Environmental chemistry is the study of chemical alterations in the environment. Principal areas of study include soil contamination and water pollution. The topics of analysis involve chemical degradation in the environment, multi-phase transport of chemicals (for example, evaporation of a solvent-containing lake to yield solvent as an air pollutant), and chemical effects upon biota. As an example study, consider the case of a leaking solvent tank whose contents have entered the soil upgradient of the habitat of an endangered species of amphibian. Physicists would develop a computer model to understand the extent of soil contamination and subsurface transport of solvent, chemists would analyze the molecular bonding of the solvent to the specific soil type, and biologists would study the impacts upon soil arthropods, plants, and ultimately the pond-dwelling copepods that are the food of the endangered amphibian.
138. Global Positioning System
The Global Positioning System (GPS) is a global navigation satellite system (GNSS) developed by the United States Department of Defense and managed by the United States Air Force 50th Space Wing. It is the only fully functional GNSS in the world, can be used freely by anyone, anywhere, and is often used by civilians for navigation purposes. It uses a constellation of between 24 and 32 medium Earth orbit satellites that transmit precise radio wave signals, which allow GPS receivers to determine their current location, the time, and their velocity. Its official name is NAVSTAR GPS. Although NAVSTAR is not an acronym, a few backronyms have been created for it.
Since it became fully operational on April 27, 1995, GPS has become a widely used aid to navigation worldwide, and a useful tool for map-making, land surveying, commerce, scientific uses, tracking and surveillance, and hobbies such as geocaching. Also, the precise time reference is used in many applications including the scientific study of earthquakes and as a required time synchronization method for cellular network protocols such as the IS-95 standard for CDMA.
A GPS receiver calculates its position by precisely timing the signals sent by the GPS satellites high above the Earth. Each satellite continually transmits messages containing the time the message was sent, precise orbital information (the ephemeris), and the general system health and rough orbits of all GPS satellites (the almanac). The receiver measures the transit time of each message and computes the distance to each satellite. Geometric trilateration is used to combine these distances with the location of the satellites to determine the receiver's location. The position is displayed, perhaps with a moving map display or latitude and longitude; elevation information may be included. Many GPS units also show derived information such as direction and speed, calculated from position changes.
It might seem three satellites are enough to solve for position, since space has three dimensions. However, even a very small clock error multiplied by the very large speed of light—the speed at which satellite signals propagate—results in a large positional error. Therefore receivers use four or more satellites to solve for x, y, z, and t, which is used to correct the receiver's clock. While most GPS applications use the computed location only and effectively hide the very accurately computed time, it is used in a few specialized GPS applications such as time transfer, traffic signal timing, and synchronization of cell phone base stations.
Although four satellites are required for normal operation, fewer suffice in special cases. If one variable is already known (for example, a ship or plane may know its elevation), a receiver can determine its position using only three satellites. Some GPS receivers may use additional clues or assumptions (such as reusing the last known altitude, dead reckoning, inertial navigation, or including information from the vehicle computer) to give a degraded position when fewer than four satellites are visible.
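The four-unknown solve described above (x, y, z, and the clock bias) can be sketched in a reduced two-dimensional form: three "satellites" at known positions, three pseudoranges, and Newton iteration for (x, y) and the clock bias. An idealized illustration with simulated, noise-free pseudoranges in arbitrary units, not real GPS processing:

```python
import math

def gauss_solve(a, b):
    """Solve a small linear system a*x = b by Gaussian elimination
    with partial pivoting."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def gps_fix(sats, pseudoranges, iterations=20):
    """Newton iteration on the pseudorange equations in a 2-D toy:
    unknowns are position (x, y) and receiver clock bias b, so three
    'satellites' take the place of the four needed in 3-D."""
    x = y = b = 0.0
    for _ in range(iterations):
        jac, res = [], []
        for (sx, sy), rho in zip(sats, pseudoranges):
            d = math.hypot(x - sx, y - sy) or 1e-9
            res.append(rho - (d + b))                      # measured minus predicted
            jac.append([(x - sx) / d, (y - sy) / d, 1.0])  # partial derivatives
        dx, dy, db = gauss_solve(jac, res)
        x, y, b = x + dx, y + dy, b + db
    return x, y, b

# Simulate noise-free pseudoranges for a receiver at (3, 4) whose clock
# bias adds 0.5 range-units to every measurement.
sats = [(0.0, 10.0), (10.0, 10.0), (10.0, 0.0)]
bias = 0.5
rhos = [math.hypot(3.0 - sx, 4.0 - sy) + bias for sx, sy in sats]
x, y, b = gps_fix(sats, rhos)   # recovers both the position and the bias
```

Note how the solver recovers the clock bias along with the position: every pseudorange is inflated by the same bias, so one extra measurement beyond the pure-geometry minimum pins it down, which is exactly why a real receiver needs a fourth satellite in three dimensions.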
139. Global Television Network
Global Television Network (more commonly called Global TV or just Global) is a Canadian English-language, privately owned television network. It is owned by Canwest Media Inc., a division of Canwest. Throughout the 1990s, it dominated primetime ratings in key markets such as southern and southwestern B.C., but had limited reach in certain areas such as Alberta until 2000.
Over the network's history, there has been some evidence that Global considers its news coverage subordinate to its usual primetime lineup of entertainment programming. While coverage of some breaking events has increased since the launch of Global National, the network attracted controversy in 2003 when CKND aired its usual programming schedule on the night of the Manitoba provincial election rather than providing any special news programming, and when CIII bumped its Ontario provincial election coverage to CHCH in order to avoid preempting Survivor. Both stations aired full election night coverage in those provinces' 2007 elections.
Global launched its first investigative newsmagazine series on November 30, 2008. The weekly 30-minute program, titled 16x9 - The Bigger Picture, features a high-gloss, tabloid format, and is the network's first foray into the field long occupied by CTV's W-FIVE and CBC's the fifth estate. Global also airs a weekly documentary series, Global Currents.
On October 4, 2007, parent company Canwest announced it would be centralizing news production control room functions for all owned & operated conventional TV stations (except CHBC Kelowna) at four broadcast centers - CHAN Vancouver, CITV Edmonton, CICT Calgary, and CIII Toronto. The company stated this would allow all of its stations to make a transition to high definition broadcasting, and create around 50 new jobs at the four stations. Approximately 250 positions were to be eliminated in the other stations, the majority of which were behind-the-scenes/technical positions.
A press release from the company also stated that on-air talent (including weather anchors), reporters, producers, photographers, editors, and other news-gathering positions would remain at the affected stations. This cost-cutting move by the company will be completed over several months. In mid-August, Global Edmonton took over production of Global Halifax's newscasts, and on September 4, 2008, took over production of all newscasts at CHCA News. Global is expected to begin production of Global Lethbridge's newscasts in mid-September, and later in the year Global Vancouver will take over production of both Global Regina and Global Saskatoon newscasts. Newscasts will also be produced on virtual sets.
140. Mitigation of global warming
Mitigation of global warming involves taking actions to reduce greenhouse gas emissions and to enhance sinks aimed at reducing the extent of global warming. This is in distinction to adaptation to global warming which involves taking action to minimize the effects of global warming. Scientific consensus on global warming, together with the precautionary principle and the fear of abrupt climate change is leading to increased effort to develop new technologies and sciences and carefully manage others in an attempt to mitigate global warming.
The Stern Review identifies several ways of mitigating climate change. These include reducing demand for emissions-intensive goods and services, increasing efficiency gains, increasing use and development of low-carbon technologies, and reducing non-fossil fuel emissions.
The energy policy of the European Union has set a target of limiting the global temperature rise to 2 °C [3.6 °F] compared to preindustrial levels, of which 0.8 °C has already taken place and another 0.5 °C is already committed. The 2 °C rise is typically associated in climate models with a carbon dioxide concentration of 400-500 ppm by volume; the current level as of January 2007 is 383 ppm by volume, and rising at 2 ppm annually. Hence, to avoid a very likely breach of the 2 °C target, CO2 levels would have to be stabilized very soon; this is generally regarded as unlikely, based on current programs in place to date. The importance of change is illustrated by the fact that world economic energy efficiency is presently improving at only half the rate of world economic growth.
At the core of most proposals is the reduction of greenhouse gas emissions through reducing energy use and switching to cleaner energy sources. Frequently discussed energy conservation methods include increasing the fuel efficiency of vehicles (often through hybrid, plug-in hybrid, and electric cars and improving conventional automobiles), individual-lifestyle changes and changing business practices. Newly developed technologies and currently available technologies including renewable energy (such as solar power, tidal and ocean energy, geothermal power, and wind power) and more controversially nuclear power and the use of carbon sinks, carbon credits, and taxation are aimed more precisely at countering continued greenhouse gas emissions. More radical proposals include geoengineering techniques ranging from carbon sequestration projects such as carbon dioxide air capture, to solar radiation management schemes such as the creation of stratospheric sulfur aerosols. The ever-increasing global population and the planned growth of national GDPs based on current technologies are counter-productive to most of these proposals.
141. Common Object Request Broker Architecture
The Common Object Request Broker Architecture (CORBA) is a standard defined by the Object Management Group (OMG) that enables software components written in multiple computer languages and running on multiple computers to work together.
CORBA is a mechanism in software for normalizing the method-call semantics between application objects that reside either in the same address space (application) or remote address space (same host, or remote host on a network). Version 1.0 was released in October 1991. CORBA uses an interface definition language (IDL) to specify the interfaces that objects will present to the outside world. CORBA then specifies a “mapping” from IDL to a specific implementation language like C++ or Java. Standard mappings exist for C, C++, Lisp, Ruby, Smalltalk, Java, COBOL, PL/I and Python. There are also non-standard mappings for Perl, Visual Basic, Erlang, and Tcl implemented by object request brokers (ORBs) written for those languages.
The CORBA specification dictates that there shall be an ORB through which the application interacts with other objects. In practice, the application simply initializes the ORB, and accesses an internal Object Adapter which maintains such issues as reference counting, object (and reference) instantiation policies, object lifetime policies, etc. The Object Adapter is used to register instances of the generated code classes. Generated Code Classes are the result of compiling the user IDL code which translates the high-level interface definition into an OS- and language-specific class base for use by the user application. This step is necessary in order to enforce the CORBA semantics and provide a clean user process for interfacing with the CORBA infrastructure.
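The division of labor described above (ORB, Object Adapter, generated stub classes) can be caricatured in a few lines. This toy registers a servant and routes stub method calls to it, with marshalling, networking, and IDL compilation omitted entirely; all names here are invented for illustration and are not CORBA API:

```python
class Orb:
    """Toy stand-in for an ORB: keeps a registry (the Object Adapter's job)
    and hands out stubs that forward calls to the registered servant."""

    def __init__(self):
        self.registry = {}

    def register(self, name, servant):
        self.registry[name] = servant

    def resolve(self, name):
        servant = self.registry[name]

        class Stub:  # stands in for a generated-code class
            def __getattr__(self, method):
                return getattr(servant, method)  # forward the call

        return Stub()

class Calculator:
    """Servant implementing a hypothetical 'Calculator' interface."""
    def add(self, a, b):
        return a + b

orb = Orb()
orb.register("Calculator", Calculator())
stub = orb.resolve("Calculator")
result = stub.add(2, 3)  # the call reaches the servant through the stub
```

The point of the sketch is the indirection: the client holds only a stub and never touches the servant directly, which is what lets a real ORB substitute marshalling and a network hop behind the same call.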
Some IDL language mappings are "more hostile" than others. For example, due to the very nature of Java, the IDL-to-Java mapping is rather straightforward and makes use of CORBA very simple in a Java application. The C++ mapping is not trivial, but it accounts for all the features of CORBA, e.g. exception handling. The C mapping is stranger (since C is not an object-oriented language), but it is consistent and handles the RPC semantics just fine. (Red Hat Linux ships with the GNOME UI system, which used to have its IPC built on CORBA, now replaced by D-Bus.) A "language mapping" requires the developer (the "user" in this case) to create some IDL code that represents the interfaces to his objects. Typically, a CORBA implementation comes with a tool called an IDL compiler, which converts the user's IDL code into language-specific generated code. A traditional compiler then compiles the generated code to create the linkable object files for the application.
142. .NET Framework
The Microsoft .NET Framework is a software framework that can be installed on computers running Microsoft Windows operating systems. It includes a large library of coded solutions to common programming problems and a virtual machine that manages the execution of programs written specifically for the framework. The .NET Framework is a key Microsoft offering and is intended to be used by most new applications created for the Windows platform.
The framework's Base Class Library provides a large range of features including user interface, data and data access, database connectivity, cryptography, web application development, numeric algorithms, and network communications. The class library is used by programmers, who combine it with their own code to produce applications.
Programs written for the .NET Framework execute in a software environment that manages the program's runtime requirements. Also part of the .NET Framework, this runtime environment is known as the Common Language Runtime (CLR). The CLR provides the appearance of an application virtual machine so that programmers need not consider the capabilities of the specific CPU that will execute the program. The CLR also provides other important services such as security, memory management, and exception handling. The class library and the CLR together constitute the .NET Framework.
Version 3.0 of the .NET Framework is included with Windows Server 2008 and Windows Vista. The current version of the framework can also be installed on Windows XP and the Windows Server 2003 family of operating systems. A reduced version, the .NET Compact Framework, is also available on Windows Mobile platforms, including smartphones. Version 4.0 of the framework was released as a public beta on 20 May 2009.
The intermediate CIL code is housed in .NET assemblies. As mandated by the specification, assemblies are stored in the Portable Executable (PE) format, common on the Windows platform for all DLL and EXE files. An assembly consists of one or more files, one of which must contain the manifest, which holds the metadata for the assembly. The complete name of an assembly (not to be confused with the filename on disk) contains its simple text name, version number, culture, and public key token. The public key token is a unique hash generated when the assembly is compiled, thus two assemblies with the same public key token are guaranteed to be identical from the point of view of the framework. A private key, known only to the creator of the assembly, can also be specified; it is used for strong naming and guarantees that a new version of the assembly comes from the same author (required when adding an assembly to the Global Assembly Cache).
143. Assembly language
Assembly languages are a family of low-level languages for programming computers. They implement a symbolic representation of the numeric machine codes and other constants needed to program a particular CPU architecture. This representation is usually defined by the hardware manufacturer, and is based on abbreviations (called mnemonics) that help the programmer remember individual instructions, registers, etc. An assembly language is thus specific to a certain physical or virtual computer architecture (as opposed to most high-level languages, which are usually portable).
A utility program called an assembler is used to translate assembly language statements into the target computer's machine code. The assembler performs a more or less isomorphic translation (a one-to-one mapping) from mnemonic statements into machine instructions and data. (This is in contrast with high-level languages, in which a single statement generally results in many machine instructions.)
Many sophisticated assemblers offer additional mechanisms to facilitate program development, control the assembly process, and aid debugging. In particular, most modern assemblers (although many have been available for more than 40 years already) include a macro facility (described below), and are called macro assemblers.
There are two types of assemblers, based on how many passes through the source are needed to produce the executable program. One-pass assemblers go through the source code once and assume that all symbols will be defined before any instruction that references them. Two-pass assemblers (and multi-pass assemblers) create a table of all unresolved symbols in the first pass, then use the second pass to resolve these addresses. The advantage of a one-pass assembler is speed, which is not as important as it once was given advances in computer speed and capability. The advantage of a two-pass assembler is that symbols can be defined anywhere in the program source. As a result, the program can be organized in a more logical and meaningful way, which makes two-pass assembler programs easier to read and maintain.
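The two-pass idea can be sketched in a few lines. The following toy assembler (in Python, for a made-up three-instruction machine; all mnemonics and opcodes are invented for illustration) shows why the second pass exists: the JMP can reference the label "end" before it is defined, because pass one only records label addresses.

```python
SOURCE = [
    "start: LOAD 5",
    "       JMP end",    # forward reference: "end" is not yet defined here
    "       LOAD 9",
    "end:   HALT",
]

OPCODES = {"LOAD": 0x01, "JMP": 0x02, "HALT": 0x03}

def assemble(lines):
    # Pass 1: build the symbol table (label -> instruction address).
    symbols, address = {}, 0
    for line in lines:
        text = line.strip()
        if ":" in text:
            label, text = text.split(":", 1)
            symbols[label.strip()] = address
        if text.strip():
            address += 1
    # Pass 2: emit machine code, resolving symbols via the table.
    code = []
    for line in lines:
        text = line.split(":", 1)[-1].strip()
        if not text:
            continue
        parts = text.split()
        mnemonic = parts[0]
        operand = parts[1] if len(parts) > 1 else None
        if operand is None:
            value = 0
        elif operand in symbols:
            value = symbols[operand]   # resolved thanks to pass 1
        else:
            value = int(operand)
        code.append((OPCODES[mnemonic], value))
    return code, symbols

machine_code, symbol_table = assemble(SOURCE)
print(symbol_table)   # {'start': 0, 'end': 3}
print(machine_code)   # [(1, 5), (2, 3), (1, 9), (3, 0)]
```

A one-pass assembler would instead have to backpatch the JMP operand once "end" is finally seen, or reject the forward reference outright.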
144. Nanotechnology
Nanotechnology, shortened to "nanotech", is the study of the control of matter on an atomic and molecular scale. Generally, nanotechnology deals with structures 100 nanometers or smaller in size, and involves developing materials or devices of that size. Nanotechnology is very diverse, ranging from novel extensions of conventional device physics, to completely new approaches based upon molecular self-assembly, to developing new materials with dimensions on the nanoscale, even to speculation on whether we can directly control matter on the atomic scale.
There has been much debate on the future implications of nanotechnology. Nanotechnology has the potential to create many new materials and devices with wide-ranging applications, such as in medicine, electronics, and energy production. On the other hand, nanotechnology raises many of the same issues as any introduction of new technology, including concerns about the toxicity and environmental impact of nanomaterials and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.
One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length.
To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[4] Or another way of putting it: a nanometer is the amount a man's beard grows in the time it takes him to raise the razor to his face. Two main approaches are used in nanotechnology. In the "bottom-up" approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition. In the "top-down" approach, nano-objects are constructed from larger entities without atomic-level control.
A number of physical phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the "quantum size effect" where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play by going from macro to micro dimensions; however, it becomes dominant when the nanometer size range is reached. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in surface area to volume ratio, altering the mechanical, thermal and catalytic properties of materials. Diffusion and reactions at the nanoscale, nanostructured materials, and nanodevices with fast ion transport are generally referred to as nanoionics. Novel mechanical properties of nanosystems are of interest in nanomechanics research. The catalytic activity of nanomaterials also opens potential risks in their interaction with biomaterials.
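The surface-area-to-volume argument is simple arithmetic. For a sphere, area/volume = 3/r, so the ratio grows as particles shrink; a quick back-of-the-envelope computation:

```python
import math

def surface_to_volume_ratio(radius_nm):
    """Area/volume for a sphere of the given radius, in units of 1/nm."""
    area = 4 * math.pi * radius_nm ** 2          # nm^2
    volume = (4 / 3) * math.pi * radius_nm ** 3  # nm^3
    return area / volume                         # simplifies to 3 / radius_nm

for r in (1000.0, 100.0, 10.0, 1.0):  # from a micron-sized particle down to 1 nm
    print(f"radius {r:7.1f} nm -> area/volume = {surface_to_volume_ratio(r):.3f} per nm")
```

Shrinking the radius a thousandfold raises the ratio a thousandfold, which is why surface-dominated properties (catalysis, heat exchange) change so markedly at the nanoscale.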
145. Operating system
An operating system (commonly abbreviated OS or O/S) is an interface between hardware and user; it is responsible for the management and coordination of activities and the sharing of the resources of the computer. The operating system acts as a host for computing applications that are run on the machine. As a host, one of the purposes of an operating system is to handle the details of the operation of the hardware. This relieves application programs from having to manage these details and makes it easier to write applications. Almost all computers (including handheld computers, desktop computers, supercomputers, and video game consoles), as well as some robots, domestic appliances (dishwashers, washing machines), and portable media players, use an operating system of some type. Some of the oldest models may, however, use an embedded operating system that may be contained on a compact disc or other data storage device.
Operating systems offer a number of services to application programs and users. Applications access these services through application programming interfaces (APIs) or system calls. By invoking these interfaces, the application can request a service from the operating system, pass parameters, and receive the results of the operation. Users may also interact with the operating system with some kind of software user interface (UI) like typing commands by using command line interface (CLI) or using a graphical user interface (GUI, commonly pronounced “gooey”). For hand-held and desktop computers, the user interface is generally considered part of the operating system. On large multi-user systems like UNIX and Unix-like systems, the user interface is generally implemented as an application program that runs outside the operating system. (Whether the user interface should be included as part of the operating system is a point of contention.)
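As a small illustration of requesting services through an API, Python's standard os module wraps the underlying system calls (open, read, close, unlink, getpid, ...). The sketch below writes a small file with the high-level API, then reads it back through the low-level wrappers that correspond directly to system calls:

```python
import os
import tempfile

# Create a small file using the high-level API.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
    f.write("hello, kernel")
    path = f.name

fd = os.open(path, os.O_RDONLY)   # system call: open -> file descriptor
data = os.read(fd, 64)            # system call: read bytes from the descriptor
os.close(fd)                      # system call: release the descriptor
os.unlink(path)                   # system call: delete the file

print("pid:", os.getpid(), "read back:", data.decode())
```

Each of these calls passes parameters to the kernel and receives the result of the operation, exactly the request/response pattern described above.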
The first microcomputers did not have the capacity or need for the elaborate operating systems that had been developed for mainframes and minis; minimalistic operating systems were developed, often loaded from ROM and known as monitors. One notable early disk-based operating system was CP/M, which was supported on many early microcomputers and was closely imitated by MS-DOS, which became wildly popular as the operating system chosen for the IBM PC (IBM's version of it was called IBM DOS or PC DOS), its successors making Microsoft one of the world's most profitable companies. In the 1980s, Apple Computer Inc. (now Apple Inc.) abandoned its popular Apple II series of microcomputers to introduce the Apple Macintosh computer with an innovative graphical user interface (GUI) in the Mac OS.
The introduction of the Intel 80386 CPU chip, with 32-bit architecture and paging capabilities, provided personal computers with the ability to run multitasking operating systems like those of earlier minicomputers and mainframes. Microsoft responded to this progress by hiring Dave Cutler, who had developed the VMS operating system for Digital Equipment Corporation. He would lead the development of the Windows NT operating system, which continues to serve as the basis for Microsoft's operating systems line. Steve Jobs, a co-founder of Apple Inc., started NeXT Computer Inc., which developed the Unix-like NEXTSTEP operating system. NEXTSTEP would later be acquired by Apple Inc. and used, along with code from FreeBSD, as the core of Mac OS X.
146. Porcupine
Porcupines are rodents with a coat of sharp spines, or quills, that defend them from predators. Porcupines are the third largest of the rodents, behind the capybara and the beaver. Most porcupines are about 25–36 in (630–910 mm) long, with an 8–10 in (200–250 mm) long tail. Weighing between 12–35 lb (5.4–16 kg), they are rounded, large and slow. Porcupines come in various shades of brown, grey, and the unusual white. Porcupines' spiny protection resembles that of the unrelated erinaceomorph hedgehogs and monotreme echidnas.
A porcupine is any of 27 species of rodent belonging to the families Erethizontidae or Hystricidae. Porcupines vary in size considerably: Rothschild's Porcupine of South America weighs less than a kilogram (2.2 lb); the African Porcupine can grow to well over 10 kg (22 lb). The two families of porcupines are quite different, and although both belong to the Hystricognathi branch of the vast order Rodentia, they are not closely related.
The eleven Old World porcupines are almost exclusively terrestrial, tend to be fairly large, and have quills that are grouped in clusters. They are believed to have separated from the other hystricognaths about 30 million years ago, much earlier than the New World porcupines.
The twelve New World porcupines are mostly smaller (although the North American Porcupine reaches about 85 cm/33 in in length and 18 kg/40 lb), have their quills attached singly rather than grouped in clusters, and are excellent climbers, spending much of their time in trees. The New World porcupines evolved their spines independently (through convergent evolution) and are more closely related to several other families of rodent than they are to the Old World porcupines. Porcupines have a relatively high longevity and until recently held the record for being the longest-living rodent, a record now held by the Naked Mole Rat.
147. Integrated development environment
An integrated development environment (IDE), also known as an integrated design environment or integrated debugging environment, is a software application that provides comprehensive facilities to computer programmers for software development. An IDE normally consists of a source code editor, a compiler and/or interpreter, build automation tools, and a debugger. IDEs are designed to maximize programmer productivity by providing tightly-knit components with similar user interfaces. This should mean that the programmer has much less mode switching to do than when using discrete development programs. However, because an IDE is by its very nature a complicated piece of software, this high productivity only comes after a lengthy learning curve.
Typically an IDE is dedicated to a specific programming language, so as to provide a feature set which most closely matches the programming paradigms of the language. However, some multiple-language IDEs are in use, such as Eclipse, ActiveState Komodo, recent versions of NetBeans, Microsoft Visual Studio and WinDev.
IDEs typically present a single program in which all development is done. This program typically provides many features for authoring, modifying, compiling, deploying and debugging software. The aim is to abstract the configuration necessary to piece together command line utilities in a cohesive unit, which theoretically reduces the time to learn a language, and increases developer productivity. It is also thought that the tight integration of development tasks can further increase productivity. For example, code can be compiled while being written, providing instant feedback on syntax errors. While most modern IDEs are graphical, IDEs in use before the advent of windowing systems (such as Microsoft Windows or X11) were text-based, using function keys or hotkeys to perform various tasks (Turbo Pascal is a common example). This contrasts with software development using unrelated tools, such as vi, GCC or make.
IDEs initially became necessary when developing via a console or terminal. Early languages did not have one, since programs were prepared using flowcharts and coding forms before being submitted to a compiler. Dartmouth BASIC was the first language to be created with an IDE (and was also the first to be designed for use while sitting in front of a console or terminal). Its IDE (part of the Dartmouth Time Sharing System) was command-based, and therefore did not look much like the menu-driven, graphical IDEs prevalent today. However, it integrated editing, file management, compilation, debugging and execution in a manner consistent with a modern IDE.
148. Information technology
Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information.
Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term has become very recognizable. The information technology umbrella can be quite large, covering many fields. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.
When computer and communications technologies are combined, the result is information technology, or "infotech". Information technology is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. Presumably, when speaking of Information Technology (IT) as a whole, it is noted that the use of computers and information are associated.
The term information technology is sometimes said to have been coined by Jim Domsic in November 1981. Domsic, who worked as a computer manager for an automotive related industry, is supposed to have created the term to modernize the outdated phrase "data processing". The Oxford English Dictionary, however, in defining information technology as "the branch of technology concerned with the dissemination, processing, and storage of information, esp. by means of computers" provides an illustrative quote from the year 1958 (Leavitt & Whisler in Harvard Business Rev. XXXVI. 41/1 "The new technology does not yet have a single established name. We shall call it information technology.") that predates the so-far unsubstantiated Domsic coinage. In recent years ABET and the ACM have collaborated to form accreditation and curriculum standards for degrees in Information Technology as a distinct field of study separate from both Computer Science and Information Systems. SIGITE is the ACM working group for defining these standards.
149. Data compression
In computer science and information theory, data compression or source coding is the process of encoding information using fewer bits (or other information-bearing units) than an unencoded representation would use, through the use of specific encoding schemes.
As with any communication, compressed data communication only works when both the sender and receiver of the information understand the encoding scheme. For example, this text makes sense only if the receiver understands that it is intended to be interpreted as characters representing the English language. Similarly, compressed data can only be understood if the decoding method is known by the receiver.
Compression is useful because it helps reduce the consumption of expensive resources, such as hard disk space or transmission bandwidth. On the downside, compressed data must be decompressed to be used, and this extra processing may be detrimental to some applications. For instance, a compression scheme for video may require expensive hardware for the video to be decompressed fast enough to be viewed as it's being decompressed (the option of decompressing the video in full before watching it may be inconvenient, and requires storage space for the decompressed video). The design of data compression schemes therefore involves trade-offs among various factors, including the degree of compression, the amount of distortion introduced (if using a lossy compression scheme), and the computational resources required to compress and uncompress the data.
Lossless compression algorithms usually exploit statistical redundancy in such a way as to represent the sender's data more concisely without error. Lossless compression is possible because most real-world data has statistical redundancy. For example, in English text, the letter 'e' is much more common than the letter 'z', and the probability that the letter 'q' will be followed by the letter 'z' is very small.
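A minimal lossless scheme that exploits redundancy is run-length encoding, which replaces repeated characters with (character, count) pairs. As noted above, both sender and receiver must agree on the scheme; a sketch:

```python
def rle_encode(text):
    """Replace each run of identical characters with a (char, count) pair."""
    encoded, i = [], 0
    while i < len(text):
        j = i
        while j < len(text) and text[j] == text[i]:
            j += 1
        encoded.append((text[i], j - i))
        i = j
    return encoded

def rle_decode(pairs):
    """Expand (char, count) pairs back into the original string."""
    return "".join(ch * count for ch, count in pairs)

message = "aaaabbbcccccccd"
packed = rle_encode(message)
print(packed)                         # [('a', 4), ('b', 3), ('c', 7), ('d', 1)]
assert rle_decode(packed) == message  # lossless: decoding restores the input
```

Run-length encoding only wins when runs are common (it can expand text with no repeats); practical lossless compressors such as Huffman or Lempel-Ziv coders exploit the statistical redundancy of letter frequencies and repeated substrings instead.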
Another kind of compression, called lossy data compression or perceptual coding, is possible if some loss of fidelity is acceptable. Generally, a lossy data compression will be guided by research on how people perceive the data in question. For example, the human eye is more sensitive to subtle variations in luminance than it is to variations in color. JPEG image compression works in part by "rounding off" some of this less-important information. Lossy data compression provides a way to obtain the best fidelity for a given amount of compression. In some cases, transparent (unnoticeable) compression is desired; in other cases, fidelity is sacrificed to reduce the amount of data as much as possible.
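The "rounding off" idea can be shown with a toy quantizer. This is a hedged sketch of the principle only, not JPEG's actual DCT-based pipeline: 8-bit luminance samples are mapped to fewer levels, so the signal has fewer distinct values and compresses better, but the original samples can no longer be recovered exactly.

```python
def quantize(samples, step):
    """Map each sample to the nearest multiple of `step` (information is lost)."""
    return [round(s / step) * step for s in samples]

luminance = [12, 13, 14, 120, 121, 250]
coarse = quantize(luminance, step=16)
print(coarse)  # nearby values collapse together

# Quantization is idempotent: re-quantizing loses nothing further,
# but the original fine-grained values are gone for good.
assert quantize(coarse, 16) == coarse
```

The lost precision is the "distortion" referred to above; choosing the step size trades fidelity against compressed size.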
150. Microcontroller
A microcontroller (also microcontroller unit, MCU or µC) is a small computer on a single integrated circuit consisting of a relatively simple CPU combined with support functions such as a crystal oscillator, timers, a watchdog, and serial and analog I/O. Program memory in the form of NOR flash or OTP ROM is also often included on chip, as well as a typically small amount of read/write memory.
Microcontrollers are designed for small applications. Thus, in contrast to the microprocessors used in personal computers and other high-performance applications, simplicity is emphasized. Some microcontrollers may operate at clock frequencies as low as 32 kHz, as this is adequate for many typical applications, enabling low power consumption (milliwatts or microwatts). They will generally have the ability to retain functionality while waiting for an event such as a button press or other interrupt; power consumption while sleeping (CPU clock and most peripherals off) may be just nanowatts, making many of them well suited for long-lasting battery applications.
Microcontrollers are used in automatically controlled products and devices, such as automobile engine control systems, remote controls, office machines, appliances, power tools, and toys. By reducing the size and cost compared to a design that uses a separate microprocessor, memory, and input/output devices, microcontrollers make it economical to digitally control even more devices and processes.
The majority of computer systems in use today are embedded in other machinery, such as automobiles, telephones, appliances, and peripherals for computer systems. These are called embedded systems. While some embedded systems are very sophisticated, many have minimal requirements for memory and program length, with no operating system, and low software complexity. Typical input and output devices include switches, relays, solenoids, LEDs, small or custom LCD displays, radio frequency devices, and sensors for data such as temperature, humidity, light level etc. Embedded systems usually have no keyboard, screen, disks, printers, or other recognizable I/O devices of a personal computer, and may lack human interaction devices of any kind.
Microcontrollers must provide real-time response to events in the embedded system they are controlling. When certain events occur, an interrupt system can signal the processor to suspend processing the current instruction sequence and begin an interrupt service routine (ISR). The ISR performs any processing required based on the source of the interrupt before returning to the original instruction sequence. Possible interrupt sources are device dependent, and often include events such as an internal timer overflow, completion of an analog-to-digital conversion, a logic level change on an input such as from a button being pressed, and data received on a communication link. Where power consumption is important, as in battery-operated devices, interrupts may also wake a microcontroller from a low-power sleep state in which the processor is halted until required to do something by a peripheral event.
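The dispatch structure behind an interrupt system can be sketched in software. The following Python simulation (all interrupt numbers and routine names are illustrative, not a real MCU API) shows the core idea: a vector table maps interrupt sources to their service routines, and pending interrupts are dispatched one by one.

```python
# Illustrative interrupt source numbers (hypothetical).
TIMER_OVERFLOW, BUTTON_PRESS, UART_RX = 0, 1, 2

log = []  # records what each ISR did, standing in for real hardware actions

def timer_isr():
    log.append("timer: reload counter")

def button_isr():
    log.append("button: debounce and record press")

def uart_isr():
    log.append("uart: move received byte to buffer")

# The interrupt vector table: source number -> service routine.
vector_table = {
    TIMER_OVERFLOW: timer_isr,
    BUTTON_PRESS: button_isr,
    UART_RX: uart_isr,
}

def handle_pending(pending):
    """Dispatch each pending interrupt to its ISR, oldest first."""
    while pending:
        irq = pending.pop(0)
        vector_table[irq]()   # suspend "normal" work, run the ISR, return

handle_pending([TIMER_OVERFLOW, UART_RX, BUTTON_PRESS])
print(log)
```

On real hardware the table lives at a fixed address, the "queueing" is done by peripheral flag registers, and the CPU itself saves and restores the interrupted context, but the mapping from source to routine is the same.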
151. Work at home
Are you aware that you can do data entry work and many other varieties of internet jobs from the cozy comforts of your home and earn handsome compensation?
Well, even if you are unaware of any of the above realities, there is nothing to worry about. You are probably not the only person who does not know about the fantastic freelancing ideas and opportunities available on the internet. I am sure there are millions of other people who have yet to discover the wonders the internet has created. Anyway, it is never too late.
Let me tell you in one simple statement that the internet is a medium that has thrown open plenty of exciting opportunities for earning legitimate income. Carry on reading, and I am sure you will soon be tempted to grab a share of the multi-million-dollar outsourcing and freelancing business on the internet. Anybody, irrespective of age, gender, religion, or country of residence, can take up online internet jobs. If you are employed full time with a company, you can do online jobs during your spare time and earn extra income. If you are unemployed, you can take up online jobs full time and earn enough money to live a happy life. If you are a student, you can earn money to pay for education expenses. For stay-at-home moms, it is an ideal way to keep their knowledge and creativity alive while earning substantial remuneration.
152. Indian Hockey Federation
The Indian Hockey Federation was the Indian branch of the International Hockey Federation. In April 2008, Kandaswamy Jothikumaran, the IHF's Secretary General, resigned after a television show accused him of corruption. K. P. S. Gill, IHF chief for 14 years, lost his position [12] when the Indian Hockey Federation was suspended by the Indian Olympic Association (IOA) on April 28. The Indian Olympic Association appointed a new five-member national selection committee. This panel will work in conjunction with the International Hockey Federation in managing hockey in India. The panel was headed by Aslam Sher Khan, a former MP and former hockey captain, and includes Ashok Kumar, Ajit Pal Singh, Zafar Iqbal and Dhanraj Pillay. Aslam Sher Khan has since been replaced by Ajit Pal Singh as chairman of the national selection committee. Aslam Sher Khan was highly displeased by this decision, though he remained a selector.
In a 30 April 2008 interview with India Today, Khan indicated the impact of the 2007 film about the national women's hockey team, Chak De India, on his future strategy by stating that he wants "to create a 'Chak De' effect" within Indian hockey.
The Indian Women's Hockey Team is the national women's field hockey team of India. Captain Suraj Lata Devi led the team to gold for three consecutive years: the 2002 Commonwealth Games (the event which inspired the 2007 Bollywood hit film Chak De India), the 2003 Afro-Asian Games, and the 2004 Hockey Asia Cup. They were referred to as the "Golden girls of hockey" after winning the 2004 Hockey Asia Cup.
153. Tennis
Tennis is a sport played between two players (singles) or between two teams of two players each (doubles). Each player uses a strung racquet to strike a hollow rubber ball covered with felt (usually optic yellow, but it can be any color or even two-tone) over a net into the opponent's court. The modern game of tennis originated in the United Kingdom in the late 19th century as "lawn tennis". This game had heavy connections to the ancient game of real tennis. After its creation, tennis spread throughout the upper-class English-speaking population before spreading around the world. Tennis is an Olympic sport and is played at all levels of society at all ages. The sport can be played by anyone who can hold a racket, including people in wheelchairs. In the United States, there is a collegiate circuit organized by the National Collegiate Athletic Association.
Except for the adoption of the tiebreaker in the 1970s, the rules of tennis have changed very little since the 1890s. A recent addition to professional tennis has been the adoption of "instant replay" technology coupled with a point-challenge system, which allows a player to challenge the official call of a point. Along with its millions of players, millions of people worldwide follow tennis as a spectator sport, especially the four Grand Slam tournaments (sometimes referred to as the "majors"): the Australian Open, the French Open, Wimbledon, and the US Open.
Tennis is played on a rectangular, flat surface, usually grass, clay, or a hardcourt of concrete and/or asphalt. The court is 78 feet (23.77 m) long, and its width is 27 feet (8.23 m) for singles matches and 36 ft (10.97 m) for doubles matches. Additional clear space around the court is required in order for players to reach overrun balls. A net is stretched across the full width of the court, parallel with the baselines, dividing it into two equal ends. The net is 3 feet 6 inches (1.07 m) high at the posts and 3 feet (91.4 cm) high in the center.
The modern tennis court owes its design to Major Walter Clopton Wingfield who, in 1873, patented a court much the same as the current one for his stické tennis (sphairistike). This template was modified in 1875 to the court design that exists today, with markings similar to Wingfield's version, but with the hourglass shape of his court changed to a rectangle. The lines that delineate the width of the court are called the baseline (farthest back) and the service line (middle of the court). The short mark in the center of each baseline is referred to as either the hash mark or the center mark. The outermost lines that make up the length are called the doubles sidelines. These are the boundaries used when doubles is being played.

154. Chess
Chess is a recreational and competitive game played between two players. The current form of the game emerged in Southern Europe during the second half of the 15th century after evolving from similar, much older games of Indian and Persian origin. Today, chess is one of the world's most popular games, played by millions of people worldwide at home, in clubs, online, by correspondence, and in tournaments.
The game is played on a square chequered chessboard with 64 squares arranged in an eight-by-eight grid. At the start, each player (one controlling the white pieces, the other controlling the black pieces) controls sixteen pieces: one king, one queen, two rooks, two knights, two bishops, and eight pawns. The object of the game is to checkmate the opponent's king, whereby the king is under immediate attack (in "check") and there is no way to remove it from attack on the next move.
The tradition of organized competitive chess started in the 16th century and has developed extensively. Chess today is a recognized sport of the International Olympic Committee. The first official World Chess Champion, Wilhelm Steinitz, claimed his title in 1886; Viswanathan Anand is the current World Champion. Theoreticians have developed extensive chess strategies and tactics since the game's inception. Aspects of art are found in chess composition.
One of the goals of early computer scientists was to create a chess-playing machine. Today's chess is deeply influenced by the abilities of current chess programs and the ability to play against others online. In 1997, Deep Blue became the first computer to beat the reigning World Champion in a match when it defeated Garry Kasparov.

155. Barack Obama
Barack Hussein Obama II (born August 4, 1961) is the 44th and current President of the United States. He is the first African American to hold the office. Obama was the junior United States Senator from Illinois from January 2005 until November 2008, when he resigned after his election to the presidency.

Obama is a graduate of Columbia University and Harvard Law School, where he was the first African American president of the Harvard Law Review. He was a community organizer in Chicago before earning his law degree. He worked as a civil rights attorney in Chicago and also taught constitutional law at the University of Chicago Law School from 1992 to 2004. Obama served three terms in the Illinois Senate from 1997 to 2004. Following an unsuccessful bid for a seat in the U.S. House of Representatives in 2000, Obama ran for United States Senate in 2004. His victory from a crowded field in the March 2004 Democratic primary raised his visibility, and his prime-time televised keynote address at the Democratic National Convention in July 2004 made him a rising star nationally in the Democratic Party. He was elected to the U.S. Senate in November 2004 by the largest margin in Illinois history.

He began his run for the presidency in February 2007. After a close campaign in the 2008 Democratic Party presidential primaries against Hillary Rodham Clinton, he won his party's nomination, becoming the first major-party African American candidate for president. In the 2008 general election, he defeated Republican candidate John McCain and was inaugurated as president on January 20, 2009.
Obama was elected to the Illinois Senate in 1996, succeeding State Senator Alice Palmer as Senator from Illinois's 13th District, which at that time spanned Chicago South Side neighborhoods from Hyde Park-Kenwood south to South Shore and west to Chicago Lawn. Once elected, Obama gained bipartisan support for legislation reforming ethics and health care laws. He sponsored a law increasing tax credits for low-income workers, negotiated welfare reform, and promoted increased subsidies for childcare. In 2001, as co-chairman of the bipartisan Joint Committee on Administrative Rules, Obama supported Republican Governor Ryan's payday loan regulations and predatory mortgage lending regulations aimed at averting home foreclosures.
Obama was reelected to the Illinois Senate in 1998, defeating Republican Yesse Yehudah in the general election, and was reelected again in 2002. In 2000, he lost a Democratic primary run for the U.S. House of Representatives to four-term incumbent Bobby Rush by a margin of two to one. In January 2003, Obama became chairman of the Illinois Senate's Health and Human Services Committee when Democrats, after a decade in the minority, regained a majority. He sponsored and led unanimous, bipartisan passage of legislation to monitor racial profiling by requiring police to record the race of drivers they detained, and legislation making Illinois the first state to mandate videotaping of homicide interrogations. During his 2004 general election campaign for U.S. Senate, police representatives credited Obama for his active engagement with police organizations in enacting death penalty reforms. Obama resigned from the Illinois Senate in November 2004 following his election to the U.S. Senate.
156. 2009 Flu A(H1N1) Minor outbreaks of swine influenza occurred in humans in 1976 and 1988, and in pigs in 1998 and 2007. In the 2009 swine flu outbreak, the virus isolated from patients in the United States was found to be made up of genetic elements from four different flu viruses – North American Mexican influenza, North American avian influenza, human influenza, and a swine influenza virus typically found in Asia and Europe – "an unusually mongrelised mix of genetic sequences." This new strain appears to be a result of reassortment of human influenza and swine influenza viruses, in which all four strains are of subtype H1N1. However, as the virus has not yet been isolated in animals to date, and also for historical naming reasons, the World Organisation for Animal Health (OIE) suggests it be called "North American influenza". On April 30, 2009 the World Health Organization began referring to the outbreak as "Influenza A" instead of "swine flu", and later began referring to it as "Influenza A(H1N1)". Several complete genome sequences for U.S. flu cases were rapidly made available through the Global Initiative on Sharing Avian Influenza Data (GISAID). Preliminary genetic characterization found that the hemagglutinin (HA) gene was similar to that of swine flu viruses present in U.S. pigs since 1999, but the neuraminidase (NA) and matrix protein (M) genes resembled versions present in European swine flu isolates. The six genes from American swine flu are themselves mixtures of swine flu, bird flu, and human flu viruses. While viruses with this genetic makeup had not previously been found to be circulating in humans or pigs, there is no formal national surveillance system to determine what viruses are circulating in pigs in the U.S.
The more recent Russian flu was a 1977–1978 flu epidemic caused by strain Influenza A/USSR/90/77 (H1N1). It infected mostly children and young adults under 23 because a similar strain was prevalent in 1947–57, causing most adults to have substantial immunity. Some have called it a flu pandemic but because it only affected the young it is not considered a true pandemic. The virus was included in the 1978–1979 influenza vaccine.
157. Information Technology Information technology (IT), as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." IT deals with the use of electronic computers and computer software to convert, store, protect, process, transmit, and securely retrieve information. Today, the term information technology has ballooned to encompass many aspects of computing and technology, and the term has become very recognizable. The information technology umbrella can be quite large, covering many fields. IT professionals perform a variety of duties that range from installing applications to designing complex computer networks and information databases. A few of the duties that IT professionals perform may include data management, networking, engineering computer hardware, database and software design, as well as the management and administration of entire systems.
When computer and communications technologies are combined, the result is information technology, or "infotech". Information technology is a general term that describes any technology that helps to produce, manipulate, store, communicate, and/or disseminate information. Presumably, when speaking of Information Technology (IT) as a whole, it is noted that the use of computers and information are associated.
The term information technology is sometimes said to have been coined by Jim Domsic of Michigan in November 1981. Domsic, who worked as a computer manager for an automotive related industry, is supposed to have created the term to modernize the outdated phrase "data processing". The Oxford English Dictionary, however, in defining information technology as "the branch of technology concerned with the dissemination, processing, and storage of information, esp. by means of computers" provides an illustrative quote from the year 1958 (Leavitt & Whisler in Harvard Business Rev. XXXVI. 41/1 "The new technology does not yet have a single established name. We shall call it information technology.") that predates the so-far unsubstantiated Domsic coinage.
158. Data Entry-India In response to the pressures of increased competition and economic slowdown, many companies are seeking to significantly reduce their cost base. So companies are facing the challenge of needing to invest in strategic IT developments at the same time as cutting the IT budget. At Data Entry India we have been helping clients to meet this challenge for more than a decade through our tailored outsourcing solutions. Outsourcing your Internet content preparation and data entry requirements to professionals can potentially save your organization money and remove bottlenecks. Put your annual reports and other documents on the Internet. Make an online library of your publications, which is available 24 hours a day, 7 days a week, anytime, anyplace. Data Entry India is an Indian business process and IT services company. Through strong relationships, commercial innovation and our integrated Indian delivery capability, we drive real and long-term cost reductions, performance improvements and new ways of working tailored to each client. Data Entry India is the premier provider of digital content outsourcing services. Data Entry India encourages an open developmental process with our clients, keeping you aware and involved in every detail of the project. Our clients remain our primary concern, and we assure constant communication to fulfill every project requirement through our commitment to quality service. We work hard to evaluate your project needs and suggest the most effective means to showcase the appeal of your products and services. Whether creating an interactive presentation to establish market presence or providing large-scale data capture and data conversion services for a broader market, Data Entry India can easily find the solution that is right for your company. Data Entry India specializes in bringing your business online with innovative technologies and world-class design.
Data Entry India is committed to delivering seamless, consistent services of the highest standard and in the most cost-effective manner to our clients. Data Entry India provides a complete paper-to-electronic solution. Through our data processing division, we offer you a wide range of services including large-scale data entry, document conversion, scanning solutions and database management, just to name a few. Data Entry India has the expertise to handle large Internet content preparation projects. Our growth is the result of providing our clients with state-of-the-art, cost-effective solutions for delivering content.
159. “Real” Work At Home Jobs Finding a legitimate work at home job is harder than finding a traditional job. First of all, despite all the online advertising you see, there aren't that many of them. Those that are available may require that you live in a certain area or spend at least some time in the office. Others may be part-time or freelance, so, you will need to be continually seeking potential positions. Keep in mind that the skills needed for home employment are similar to those needed for working in an office. You need both the experience and the skills necessary to do the job. You'll also need a home office with phone, fax, computer, printer, software, and other basic office equipment.
To get started, consider, for now, your job search as your job. Dedicate as many hours per week to your search for employment as you would spend working. If you're looking for full-time work, you should be spending full-time hours seeking employment. Networking remains the top way to find a job and it does work. Develop contacts - friends, family, college alumni, even the other job seekers who participate in the Job Searching Discussion Forum - anyone who might help generate information and job leads. You can take a direct approach and ask for job leads or try a less formal approach and ask for information and advice. Contact everyone you know and tell them you want to work from home. You may be surprised by the people they know and the leads you can generate.
Check the sites that list work at home jobs and look through all the listings, and remember to take advantage of the Resume Posting section, if the site has one. That way companies seeking employees will be able to find your resume. Use the job search engines with terms like "work at home", "telecommute" and "freelance". Next, search the online job banks using keywords like "work at home", "telecommute" and "telecommuting". Searching Monster, for example, using "telecommuting" as a keyword generates almost 200 listings. "Work at home" generates close to 1,000 positions. Searching Yahoo! HotJobs brought similar results.
This is a time when it makes sense not to simply search the web search engines. I've found that searching for "work at home" most often brings up scams or web sites that want to charge you for providing "real" work at home jobs or for "proven successful" home business information. Rather, stick with the sites that focus on employment. Be prepared to apply online. Have a resume and cover letter ready to send. Depending on the type of employment you're looking for, you may also need work samples to send to prospective employers. Track where you've applied. Many of the same positions are listed on multiple sites, so you'll want to be sure not to duplicate your efforts.
160. Sania Mirza Born to Imran Mirza and Nasima Mirza, Sania began playing lawn tennis at the young age of six. C. G. Krishna Bhupathi, father of Indian tennis star Mahesh Bhupathi, was her coach when she started playing at the Nizam Club in her home town of Hyderabad. She learnt professional tennis at the Sinnet tennis academy in Secunderabad and later moved to the Ace tennis academy in the USA. Her first appearance in the international arena was when she represented India at the World Junior Championship in 1999 held at Jakarta, Indonesia. Sania Mirza created history when she reached the third round of the Australian Open 2005. She became the first Indian to achieve the feat. Sania Mirza has also been honoured with the prestigious Arjuna award by the Indian government for the year 2004.
Sania admires Steffi Graf and her favourite movie is Ocean's 11. She adores biryani and her favourite colours are red and black. The 18-year-old Indian tennis star Sania Mirza has been honoured by the Indian government by being conferred the prestigious Arjuna award for sportspersons. The Arjuna award recognises outstanding achievement in the field of sports at the international level. It carries a cash award of Indian Rupees 300,000. The awards will be presented by the Indian president Abdul Kalam on 29 August 2005 at his official residence, Rashtrapathi Bhawan, New Delhi, India. India's Sania Mirza came back strongly after her first round defeat in the French Open to storm into the quarter-finals of the DFS Classic WTA tournament being held in Birmingham, UK.
In April 2003, Mirza made her debut in the India Fed Cup team, winning all three singles matches. Mirza won the 2003 Wimbledon Championships Girls' Doubles title, teaming up with Alisa Kleybanova of Russia. Mirza is the highest ranked female tennis player ever from India, with a career high ranking of 27 in singles and 18 in doubles. She holds the distinction of being the first Indian woman to be seeded in a Grand Slam tennis tournament. She also became the first Indian woman to reach the fourth round of a Grand Slam tournament at the 2005 U.S. Open, defeating Mashona Washington, Maria Elena Camerin and Marion Bartoli. In 2004, she finished runner-up at the Asian Tennis Championship. In winning, with Mahesh Bhupathi, the Mixed Doubles event at the 2009 Australian Open, she became the first Indian woman to win any Grand Slam event.
In 2005, Mirza reached the third round of the Australian Open, losing to eventual champion Serena Williams. On February 12, 2005, she became the first Indian woman to win a WTA singles title, defeating Alyona Bondarenko of Ukraine in the Hyderabad Open final. As of September 2006, Mirza has notched up three top 10 wins: against Svetlana Kuznetsova, Nadia Petrova and Martina Hingis. At the 2006 Doha Asian Games, Mirza won the silver in the women's singles category and the gold in the mixed doubles partnering Leander Paes. She was also part of the Indian women's team that won the silver in the team event.
161. 3G Mobiles 3G is the third generation of telecommunication hardware standards and general technology for mobile networking, superseding 2.5G. It is based on the International Telecommunication Union (ITU) family of standards under IMT-2000. 3G networks enable network operators to offer users a wider range of more advanced services while achieving greater network capacity through improved spectral efficiency. Services include wide-area wireless voice telephony, video calls, and broadband wireless data, all in a mobile environment. Additional features include HSPA data transmission capabilities, which provide users with data rates of up to 14.4 Mbit/s on the downlink and 5.8 Mbit/s on the uplink.
Unlike IEEE 802.11 networks, which are commonly called Wi-Fi or WLAN networks, 3G networks are wide-area cellular telephone networks which provide high-speed Internet access and video telephony to 3G network subscribers. IEEE 802.11 networks are short range, high-bandwidth networks primarily developed for data.
By June 2007, the 200 millionth 3G subscriber had been connected. Out of 3 billion mobile phone subscriptions worldwide, this is only 6.7%. In the countries where 3G was launched first - Japan and South Korea - 3G penetration is over 70%. In Europe the leading country is Italy, with a third of its subscribers migrated to 3G. Other leading countries by 3G migration include the UK, Austria, Australia and Singapore at the 20% migration level. A confusing statistic counts CDMA2000 1x RTT customers as if they were 3G customers. Using that definition, the total 3G subscriber base would have been 475 million as of June 2007, or 15.8% of all subscribers worldwide.
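The percentages above follow directly from the subscriber counts quoted in the text; a quick arithmetic check (all figures taken from the text itself):

```python
# Verify the 3G market-share percentages from the quoted subscriber counts.
total_subscribers = 3_000_000_000   # mobile subscriptions worldwide, June 2007
strict_3g = 200_000_000             # 3G subscribers under the strict definition
with_cdma_1x = 475_000_000          # including CDMA2000 1x RTT customers

strict_share = strict_3g / total_subscribers * 100
broad_share = with_cdma_1x / total_subscribers * 100

print(f"Strict 3G share: {strict_share:.1f}%")      # 6.7%
print(f"Including 1x RTT: {broad_share:.1f}%")      # 15.8%
```

Both results match the figures given in the passage, so the two statistics differ only in whether CDMA2000 1x RTT is counted as 3G.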
Still, several major countries, such as Indonesia, have not awarded 3G licenses, and customers await 3G services. China delayed its decisions on 3G for many years, mainly because of the government's delay in establishing well-defined standards. China announced in May 2008 that the telecoms sector would be re-organized and three 3G networks allocated, so that the largest mobile operator, China Mobile, would retain its GSM customer base. China Unicom would retain its GSM customer base but relinquish its CDMA2000 customer base, and launch 3G on the globally leading WCDMA (UMTS) standard. The CDMA2000 customers of China Unicom would go to China Telecom, which would then launch 3G on the CDMA 1x EV-DO standard. This meant that China would have all three main cellular 3G standards in commercial use. Finally, in January 2009, China's Ministry of Industry and Information Technology awarded licenses for all three standards: TD-SCDMA to China Mobile, WCDMA to China Unicom and CDMA2000 to China Telecom.
162. Global Warming Global warming is the increase in the average temperature of the Earth's near-surface air and oceans since the mid-twentieth century and its projected continuation. Global surface temperature increased 0.74 ± 0.18 °C (1.33 ± 0.32 °F) during the last century. The Intergovernmental Panel on Climate Change (IPCC) concludes that anthropogenic greenhouse gases are responsible for most of the observed temperature increase since the middle of the twentieth century, while natural phenomena such as solar variation and volcanoes produced most of the warming from pre-industrial times to 1950 and had a small cooling effect afterward. These basic conclusions have been endorsed by more than 40 scientific societies and academies of science, including all of the national academies of science of the major industrialized countries.
Climate model projections summarized in the latest IPCC report indicate that global surface temperature will probably rise a further 1.1 to 6.4 °C (2.0 to 11.5 °F) during the twenty-first century. The uncertainty in this estimate arises from the use of models with differing climate sensitivity, and the use of differing estimates of future greenhouse gas emissions. Some other uncertainties include how warming and related changes will vary from region to region around the globe. Most studies focus on the period up to 2100. However, warming is expected to continue beyond 2100 even if emissions stop, because of the large heat capacity of the oceans and the long lifetime of carbon dioxide in the atmosphere.
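The Fahrenheit ranges quoted alongside the Celsius figures can be checked with the standard conversion for temperature differences (a difference scales by 9/5; the +32 offset applies only to absolute temperatures). A quick sketch using the numbers from the text:

```python
# Temperature *differences* convert between Celsius and Fahrenheit
# by a factor of 9/5 only; no +32 offset is applied to a change.
def delta_c_to_f(delta_c: float) -> float:
    return delta_c * 9 / 5

observed = delta_c_to_f(0.74)       # warming over the last century
uncertainty = delta_c_to_f(0.18)    # its quoted uncertainty
low = delta_c_to_f(1.1)             # low end of the IPCC projection
high = delta_c_to_f(6.4)            # high end of the IPCC projection

print(f"{observed:.2f} +/- {uncertainty:.2f} F")  # 1.33 +/- 0.32 F
print(f"{low:.1f} to {high:.1f} F")               # 2.0 to 11.5 F
```

The converted values reproduce the Fahrenheit figures given in the passage, confirming the two unit systems describe the same warming.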
Increasing global temperature will cause sea levels to rise and will change the amount and pattern of precipitation, probably including expansion of subtropical deserts. The continuing retreat of glaciers, permafrost and sea ice is expected, with the Arctic region being particularly affected. Other likely effects include shrinkage of the Amazon rainforest and boreal forests, increases in the intensity of extreme weather events, species extinctions and changes in agricultural yields.
Political and public debate continues regarding the appropriate response to global warming. The available options are mitigation to reduce further emissions; adaptation to reduce the damage caused by warming; and, more speculatively, geoengineering to reverse global warming. Most national governments have signed and ratified the Kyoto Protocol aimed at reducing greenhouse gas emissions.
163. Taj Mahal The Taj Mahal is a mausoleum located in Agra, India, built by Mughal Emperor Shah Jahan in memory of his favorite wife, Mumtaz Mahal. The Taj Mahal (also "the Taj") is considered the finest example of Mughal architecture, a style that combines elements from Persian, Ottoman, Indian, and Islamic architectural styles. In 1983, the Taj Mahal became a UNESCO World Heritage Site and was cited as "the jewel of Muslim art in India and one of the universally admired masterpieces of the world's heritage."
While the white domed marble mausoleum is its most familiar component, the Taj Mahal is actually an integrated complex of structures. Building began around 1632 and was completed around 1653, and employed thousands of artisans and craftsmen. The construction of the Taj Mahal was entrusted to a board of architects under imperial supervision, including Abd ul-Karim Ma'mur Khan, Makramat Khan, and the Persian Ustad Ahmad Lahauri, although Lahauri is generally considered to be the principal designer.
The minarets, which are each more than 40 meters tall, display the designer's penchant for symmetry. They were designed as working minarets — a traditional element of mosques, used by the muezzin to call the Islamic faithful to prayer. Each minaret is effectively divided into three equal parts by two working balconies that ring the tower. At the top of the tower is a final balcony surmounted by a chattri that mirrors the design of those on the tomb. The chattris all share the same decorative elements of a lotus design topped by a gilded finial. The minarets were constructed slightly outside of the plinth so that, in the event of collapse (a typical occurrence with many tall constructions of the period), the material from the towers would tend to fall away from the tomb. The exterior decorations of the Taj Mahal are among the finest to be found in Mughal architecture. As the surface area changes, the decorations are refined proportionally. The decorative elements were created by applying paint, stucco, stone inlays, or carvings. In line with the Islamic prohibition against the use of anthropomorphic forms, the decorative elements can be grouped into either calligraphy, abstract forms or vegetative motifs.
164. Viswanathan Anand Viswanathan Anand (born 11 December 1969) is an Indian chess grandmaster and the current World Chess Champion. Anand held the FIDE World Chess Championship from 2000 to 2002, at a time when the world title was split. He became the undisputed World Champion in 2007 and defended his title against Vladimir Kramnik in 2008. With this win, he became the first player in chess history to have won the World Championship in three different formats: knockout, tournament, and match. He will next defend his title in the World Chess Championship 2009 against Veselin Topalov, the winner of a challenger match against Gata Kamsky in February 2009.
Anand is one of four players in history to break the 2800 mark on the FIDE rating list. He topped the world rankings in five of the six rating lists published from April 2007 to July 2008. In October 2008, he dropped out of the world top three ranking for the first time since July 1996. In 2007 he was awarded India's second highest civilian award, the Padma Vibhushan. He is also the first recipient of the Rajiv Gandhi Khel Ratna Award (1991-92), India's highest sporting honour.
Anand was born on 11 December 1969 in Chennai, Tamil Nadu, to Vishwanathan, who retired as General Manager of Southern Railways, and Susheela, a housewife, chess, film and club aficionado, and influential socialite. He has a brother and a sister. He was taught to play by his mother. He described his start in chess in a conversation with Susan Polgar:
I started when I was six. My mother taught me how to play. In fact, my mother used to do a lot for my chess. We moved to the Philippines shortly afterward. I joined the club in India and we moved to the Philippines for a year. And there they had a TV program that was on in the afternoon, one to two or something like that, when I was in school. So she would write down all the games that they showed and the puzzles, and in the evening we solved them together.
Of course my mother and her family used to play some chess, and she used to play her younger brother, so she had some background in chess, but she never went to a club or anything like that. So we solved all these puzzles and sent in our answers together. And they gave the prize of a book to the winner. And over the course of many months, I won so many prizes. At one point they just said take all the books you want, but don't send in any more entries. Anand holds a degree in commerce and his hobbies are reading, swimming and listening to music. He lives in Collado Mediano in Spain with his wife Aruna.
165. Viswanathan Anand:Early career Anand's rise in the Indian chess world was meteoric. National level success came early for him when he won the National Sub-Junior Chess Championship with a score of 9/9 in 1983 at the age of fourteen. He became the youngest Indian to win the International Master Title at the age of fifteen, in 1984. At the age of sixteen he became the national chess champion and won that title two more times. He played games at blitz speed. In 1987, he became the first Indian to win the World Junior Chess Championship. In 1988, at the age of eighteen, he became India's first Grandmaster by winning Shakti Finance International chess tournament held in Coimbatore, India. He was awarded Padma Shri at the age of 18.
"Vishy", as he is sometimes called by his friends, burst upon the upper echelons of the chess scene in the early 1990s, winning such tournaments as Reggio Emilia 1991 (ahead of Garry Kasparov and Anatoly Karpov). Playing at such a high level did not slow him down, and he continued to play games at blitz speed.
In the World Chess Championship 1993 cycle Anand qualified for his first Candidates Tournament, winning his first match but narrowly losing his quarter-final match to Anatoly Karpov.
In 1994-95 Anand and Gata Kamsky dominated the qualifying cycles for the rival FIDE and PCA world championships. In the FIDE cycle (FIDE World Chess Championship 1996), Anand lost his quarter-final match to Kamsky after leading early. Kamsky went on to reach the championship match against Karpov.
In the 1995 PCA cycle, Anand won matches against Oleg Romanishin and Michael Adams without a loss, then avenged his FIDE loss by defeating Gata Kamsky in the Candidates final. In 1995, he played the PCA World Chess Championship 1995 against Kasparov in New York City's World Trade Center. After an opening run of eight draws (a record for the opening of a world championship match), Anand won game nine with a powerful exchange sacrifice, but then lost four of the next five. He lost the match 10.5–7.5.
Anand successfully defended the title against Kramnik in the World Chess Championship 2008 held between October 14 and October 29 in Bonn, Germany. The winner was to be the first to score 6.5 points in the twelve-game match. Anand won by scoring 6.5 points in 11 games. After the tenth game, Anand led 6-4 and needed only a draw in either of the last two games to win the match. In the eleventh game, Kramnik played the Najdorf Variation of the Sicilian Defense. Once the players traded queens, Kramnik offered a draw after 24 moves since he had no winning chances in the endgame.
167. Bill Gates William Henry "Bill" Gates III (born October 28, 1955) is an American business magnate, philanthropist, author, and chairman of Microsoft, the software company he founded with Paul Allen. He is consistently ranked among the world's wealthiest people, and was the wealthiest overall as of 2009. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder with more than 8 percent of the common stock. He has also authored or co-authored several books.
Gates is one of the best-known entrepreneurs of the personal computer revolution. Although he is admired by many, a number of industry insiders criticize his business tactics, which they consider anti-competitive, an opinion which has in some cases been upheld by the courts. In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.
Bill Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning to part-time work at Microsoft and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates' last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.
After reading the January 1975 issue of Popular Electronics that demonstrated the Altair 8800, Gates contacted Micro Instrumentation and Telemetry Systems (MITS), the creators of the new microcomputer, to inform them that he and others were working on a BASIC interpreter for the platform. In reality, Gates and Allen did not have an Altair and had not written code for it; they merely wanted to gauge MITS's interest. MITS president Ed Roberts agreed to meet them for a demo, and over the course of a few weeks they developed an Altair emulator that ran on a minicomputer, and then the BASIC interpreter. The demonstration, held at MITS's offices in Albuquerque, was a success and resulted in a deal with MITS to distribute the interpreter as Altair BASIC. Paul Allen was hired into MITS, and Gates took a leave of absence from Harvard to work with Allen at MITS in Albuquerque in November 1975. They named their partnership "Micro-Soft" and had their first office located in Albuquerque. Within a year, the hyphen was dropped, and on November 26, 1976, the trade name "Microsoft" was registered with the Office of the Secretary of the State of New Mexico.
168. Bill Gates: Early Life Gates was born in Seattle, Washington, to William H. Gates, Sr. and Mary Maxwell Gates, who was of Scottish descent. His family was upper middle class; his father was a prominent lawyer, his mother served on the board of directors for First Interstate BancSystem and the United Way, and her father, J. W. Maxwell, was a national bank president. Gates has one elder sister, Kristi (Kristianne), and one younger sister, Libby. He was the fourth of his name in his family, but was known as William Gates III or "Trey" because his father had dropped his own "III" suffix. Early in his life, Gates' parents had a law career in mind for him. At 13 he enrolled in the Lakeside School, an exclusive preparatory school. When he was in the eighth grade, the Mothers Club at the school used proceeds from Lakeside School's rummage sale to buy an ASR-33 teletype terminal and a block of computer time on a General Electric (GE) computer for the school's students. Gates took an interest in programming the GE system in BASIC and was excused from math classes to pursue his interest. He wrote his first computer program on this machine: an implementation of tic-tac-toe that allowed users to play games against the computer.
At the end of the ban, the four students offered to find bugs in CCC's software in exchange for computer time. Rather than use the system via teletype, Gates went to CCC's offices and studied source code for various programs that ran on the system, including programs in FORTRAN, LISP, and machine language. The arrangement with CCC continued until 1970, when the company went out of business. The following year, Information Sciences Inc. hired the four Lakeside students to write a payroll program in COBOL, providing them computer time and royalties. After the school's administrators became aware of his programming abilities, Gates wrote the school's computer program to schedule students in classes. He modified the code so that he was placed in classes with mostly female students. He later stated that "it was hard to tear myself away from a machine at which I could so unambiguously demonstrate success." At age 17, Gates formed a venture with Allen, called Traf-O-Data, to make traffic counters based on the Intel 8008 processor. In early 1973, Bill Gates served as a congressional page in the U.S. House of Representatives.
169. Bill Gates: Personal Life Gates married Melinda French from Dallas, Texas, on January 1, 1994. They have three children: Jennifer Katharine (1996), Rory John (1999) and Phoebe Adele (2002). The Gateses' home is an earth-sheltered house in the side of a hill overlooking Lake Washington in Medina, Washington. According to King County public records, as of 2006 the total assessed value of the property (land and house) is $125 million, and the annual property tax is $991,000.
His 66,000 sq. ft. estate has a 60-foot swimming pool with an underwater music system, as well as a 2500 sq. ft. gym and a 1000 sq. ft. dining room.
Also among Gates's private acquisitions is the Codex Leicester, a collection of writings by Leonardo da Vinci, which Gates bought for $30.8 million at an auction in 1994. Gates is also known as an avid reader, and the ceiling of his large home library is engraved with a quotation from The Great Gatsby. He also enjoys playing bridge, tennis, and golf.
Gates was number one on the "Forbes 400" list from 1993 through 2007, and number one on Forbes' list of "The World's Richest People" from 1995 to 2007 and in 2009. In 1999, Gates's wealth briefly surpassed $101 billion, causing the media to call him a "centibillionaire". Since 2000, the nominal value of his Microsoft holdings has declined, due to a fall in Microsoft's stock price after the dot-com bubble burst and the multi-billion-dollar donations he has made to his charitable foundations. In a May 2006 interview, Gates commented that he wished that he were not the richest man in the world because he disliked the attention it brought. Gates has several investments outside Microsoft, which in 2006 paid him a salary of $616,667 and a $350,000 bonus, totalling $966,667. He founded Corbis, a digital imaging company, in 1989. In 2004 he became a director of Berkshire Hathaway, the investment company headed by long-time friend Warren Buffett. Time magazine named Gates one of the 100 people who most influenced the 20th century, as well as one of the 100 most influential people of 2004, 2005, and 2006. Time also collectively named Gates, his wife Melinda and rock band U2's lead singer Bono as the 2005 Persons of the Year for their humanitarian efforts. In 2006, he was voted eighth in the list of "Heroes of our time". Gates was listed in the Sunday Times power list in 1999, named CEO of the year by Chief Executive Officers magazine in 1994, ranked number one in the "Top 50 Cyber Elite" by Time in 1998, ranked number two in the Upside Elite 100 in 1999, and was included in The Guardian as one of the "Top 100 influential people in media" in 2001.
170. M. S. Dhoni The spectacular arrival of Virender Sehwag was bound to inspire others to bat with the same mindset. But the odds of a clone emerging from the backwaters of Jharkhand, whose state side has consistently scraped the bottom, were highly remote. That was until Mahendra Singh Dhoni arrived. He can be swashbuckling with the bat and secure with the wicketkeeping gloves. His neck-length hair adds to his dash. Though Dhoni made his first-class debut in the 1999-2000 season, it was only in 2004 that he became a serious contender for national selection, with some stirring performances when the occasion demanded. It was with his two centuries against Pakistan A, in the triangular tournament in Kenya, that he established himself as a clinical destroyer of bowling attacks.
In just his fifth one-dayer, against Pakistan at Vishakapatnam, he cracked a dazzling 148 - putting even Sehwag in the shade - and followed that up with a colossal 183 not out at Jaipur against Sri Lanka in November, when he broke Adam Gilchrist's record for the highest score by a wicketkeeper in ODIs. He made an instant impact at Test level too, pounding 148 at Faisalabad in only his fifth Test. He was elevated to the vice-captaincy of the one-day squad for the tour of England and Ireland in 2007 and, soon after, was appointed captain of the Twenty20 squad for the World Championship in South Africa. A heady title triumph marked him out as a leader for the future, and he was handed the reins of the one-day side in September 2007 after Rahul Dravid decided to step down as captain. It didn't take too long for him to enhance his reputation, claiming India's first tri-series triumph in Australia. He captained Chennai Super Kings in the IPL, losing out to Shane Warne's Rajasthan Royals in a tense final. As a stop-gap Test captain, he was credited with leading India to their biggest ever win in terms of runs (320), against Australia in Mohali.
Dhoni is an aggressive right-handed batsman and wicket-keeper. Dhoni is one of a number of wicket-keepers who have come through the ranks of junior and India A cricket teams to represent the national team — Parthiv Patel, Ajay Ratra and Dinesh Karthik also followed this route. Dhoni, referred to as 'Mahi' by his friends, debuted in the Bihar cricket team during the 1998/99 cricket season and was selected to represent India-A for a tour to Kenya in 2004. Along with Gautam Gambhir, Dhoni made multiple centuries against the Pakistan-A team in a tri-nation series and was selected in the Indian national team later that year. Dhoni tends to play mostly from the back foot, with a pronounced bottom-hand grip. He has a very fast hand speed through the ball, which often results in the ball racing across the ground. From this initial stance his feet do not show much movement, which sometimes results in him chasing balls without coming to the pitch of the ball, or inside-edging a lot of deliveries. Dhoni scored 148 against Pakistan in his fifth ODI match in 2005 — then the highest score by an Indian wicketkeeper. Later in the year, he broke his own record as well as set the current world record for the highest score in the second innings in ODI matches as he scored 183* against Sri Lanka. Dhoni's success in the limited-overs format secured him a place in the Test team. Consistent performances in ODI cricket through the end of the 2005/06 season saw Dhoni briefly ranked as the No. 1 batsman in the ICC ODI ratings.
171. Yuvraj Singh Yuvraj Singh, born 12 December 1981 in Chandigarh, India, is a cricketer and the son of former Indian fast bowler and Punjabi movie star Yograj Singh. He has been a member of the Indian cricket team since 2000 (ODIs) and played his first Test match in 2003. He was the vice-captain of the ODI team from late 2007 to late 2008. At the 2007 World Twenty20 he hit six sixes in an over against England's Stuart Broad - a feat performed only three times previously in any form of senior cricket, and never before in an international match between two Test cricket nations. Yuvraj is primarily a left-handed batsman but can bowl part-time left-arm orthodox spin. He is regarded as being better at batting against fast bowling than spin bowling, and cites the Indian Oil Cup 2005 as a turning point in his career. He is one of the better fielders in the Indian team, fielding primarily at point, with a good aim at the stumps. A report published in late 2005 showed that since 1999, he was the fourth most prolific fielder in effecting ODI run-outs, and of those on the list of prolific fielders, he had the second highest rate of effecting a run-out. He was previously often characterized as having attitude problems, but later often assumed leadership positions during Rahul Dravid's tenure as captain. Yuvraj made his One Day International debut against Kenya at Nairobi in 2000, at the ICC KnockOut Trophy. He showed his potential in his second ODI, against the Australians, where he scored a quickfire 84 off 82 balls against a quality pace attack consisting of bowlers like Glenn McGrath, Brett Lee and Jason Gillespie. However, after a lean run of form, he was dropped for the one-dayers against Australia in India in early 2001, but returned later in the year and helped India to victory in a match in Sri Lanka with an unbeaten 98.
One of his most memorable innings was a partnership with Mohammad Kaif in the NatWest Series final against England in July 2002 which led India to victory. He was selected for and represented India at the 2003 Cricket World Cup. He scored his first century in his fourth season with the Indian team, against Bangladesh in 2003. After that he also scored hundreds against Zimbabwe and Australia, including a 139 off 119 balls at the Sydney Cricket Ground. In the Indian Oil Cup 2005, he made 110 (off 114 balls), his third century, and shared an important partnership worth 165 runs with Mohammad Kaif to become the man of the match against West Indies in the last match of the round-robin league. After reaching his century, he attracted attention by angry gesticulations towards the Indian dressing room, which were postulated to be due to his clashes with team management - Greg Chappell had been appointed as the new Indian coach and had criticised Yuvraj. He later praised Chappell's techniques. 172. Dhirubhai Ambani Dhirubhai Ambani was born on 28 December 1932 at Chorwad, Junagadh (now in the state of Gujarat, India) to Hirachand Gordhanbhai Ambani and Jamnaben, in a Modh Bania family of modest means. He was the second son of a school teacher. When he was 16 years old, he moved to Aden, Yemen. He worked with A. Besse & Co. for a salary of Rs.300. Two years later, A. Besse & Co. became the distributors for Shell products, and Dhirubhai was promoted to manage the company's filling station at the port of Aden. He was married to Kokilaben and had two sons, Mukesh and Anil, and two daughters, Nita Kothari and Rina Salgaonkar. In 1962, Dhirubhai returned to India and started Reliance, which was to import polyester yarn and export spices.
The business was set up in partnership with Champaklal Damani, his second cousin, who used to be with him in Aden, Yemen. The first office of the Reliance Commercial Corporation was set up at Narsinatha Street in Masjid Bunder. It was a 350 sq ft (33 m²) room with a telephone, one table and three chairs. Initially, they had two assistants to help them with their business. In 1965, Champaklal Damani and Dhirubhai Ambani ended their partnership and Dhirubhai started on his own. It is believed that both had different temperaments and a different take on how to conduct business. While Mr. Damani was a cautious trader and did not believe in building yarn inventories, Dhirubhai was a known risk taker who believed in building inventories, anticipating a price rise, and making profits. In 1968, he moved to an upmarket apartment at Altamount Road in South Mumbai. Ambani's net worth was estimated at about Rs.10 lakh by the late 1970s.
Asia Times quotes: "His people skills were legendary. A former secretary reveals: 'He was very helpful. He followed an "open-door" policy. Employees could walk into his cabin and discuss their problems with him.' The chairman had a special way of dealing with different groups of people, be they employees, shareholders, journalists or government officials. Ambani's competitors allege that he bought off officials and had legislation re-written to suit him. They recall his earlier days and how he picked up the art of profiteering from the then-Byzantine system of controls of Indian officialdom. He exported spices, often at a loss, and used replenishment licenses to import rayon. Later, when rayon started to be manufactured in India, he exported rayon, again at a loss, and imported nylon. Ambani was always a step ahead of his competitors. With the imported items being heavily in demand, his profit margins were rarely under 300 percent."
173. Mukesh Ambani Mukesh Ambani (born on April 19, 1957 in Aden, Yemen) is an Indian businessman. He is the chairman, managing director and the largest shareholder of Reliance Industries, India's largest private sector enterprise and a Fortune 500 company. His personal stake in Reliance Industries is 48%. His wealth is valued at US$19.5 billion (according to Forbes), making him the richest Indian, the richest man in Asia, and the world's 7th richest person. Mukesh and his younger brother Anil are sons of the late founder of Reliance Industries, Dhirubhai Ambani. Mukesh also owns the Indian Premier League team Mumbai Indians.
Mukesh Ambani joined Reliance in 1981 and initiated Reliance's backward integration from textiles into polyester fibres and further into petrochemicals. In this process, he directed the creation of 60 new, world-class manufacturing facilities involving diverse technologies that have raised Reliance's manufacturing capacities from less than a million tonnes to twelve million tonnes per year.
He directed and led the creation of the world's largest grassroots petroleum refinery at Jamnagar, Gujarat, India, with a present capacity of 660,000 barrels per day (105,000 m³/d, or 33 million tonnes per year), integrated with petrochemicals, power generation, port and related infrastructure, at an investment of Rs 100,000 crore (nearly US$26 billion). He will soon inaugurate his second refinery at Motikhavdi, Jamnagar.
Mukesh Ambani set up one of the largest telecommunications companies in India in the form of Reliance Communications (formerly Reliance Infocom) Limited. However, Reliance Infocom now falls under the Anil Dhirubhai Ambani Group following the brothers' split. Had the two brothers not split, with Mukesh as president, his net worth would have been around $45 billion, behind only the Walton family. Under Ambani's leadership, Reliance has entered the retail business through its wholly owned subsidiary Reliance Retail.
Under him, Reliance Retail has also launched a new chain called Delight stores and signed a letter of intent with NOVA Chemicals to make energy-efficient structures for Reliance Retail. Ambani owns the Indian Premier League team Mumbai Indians. He is also a member of the Council on Foreign Relations. Mukesh Ambani is now the co-chairman of Reliance Industries Limited.
174. Anil Ambani Anil Ambani (born June 4, 1959) is an Indian billionaire and a major shareholder in the Anil Dhirubhai Ambani Group. Anil's elder brother, Mukesh Ambani, is also a billionaire, and owns another company, Reliance Industries. Ambani joined Reliance, the company founded by his late father Dhirubhai Ambani, in 1983 as Co-Chief Executive Officer and is credited with having pioneered many financial innovations in the Indian capital markets. For example, he led India's first forays into overseas capital markets with international public offerings of global depositary receipts, convertibles and bonds. He directed Reliance in its efforts to raise, since 1991, around US$2 billion from overseas financial markets, with a 100-year Yankee bond issue in January 1997 being the high point, after which people regarded him as a financial wizard. He, along with his brother Mukesh Ambani, has steered the Reliance Group to its current status as India's leading textiles, petroleum, petrochemicals, power, and telecom company.
Anil was a member of the Uttar Pradesh Development Council (this council has now been scrapped). He is also the Chairman of the Board of Governors of DA-IICT, Gandhinagar, and a member of the Board of Governors of the Indian Institute of Technology, Kanpur. He is a member of the Board of Governors, Indian Institute of Management, Ahmedabad. He is also a member of the Central Advisory Committee, Central Electricity Regulatory Commission. In June 2004, Anil was elected as an Independent Member of the Rajya Sabha, the Upper House of the Parliament of India, with the support of the Samajwadi Party. In March 2006, he resigned. In 2007 his name was added to the list of Indian Trillionaires (in terms of Indian Rupees). He has been linked with several starlets over his long career, including his current wife of more than 15 years. He is a close friend of movie star Amitabh Bachchan. One of his major achievements in the entertainment industry is the takeover of Adlabs, the movie production-to-distribution-to-multiplex company that owns Mumbai's only dome theatre. On the evening of 23 April 2009, mud, gravel and pebbles were found in the gearbox of his 13-seat helicopter VT-RCL (a Bell 412). According to a senior pilot of Reliance Transport and Travels Pvt. Ltd., the gravel and pebbles had been put in by someone through the filler cap of the gearbox, at a height of 10 feet.
The helicopter was standing outside a hangar at the Mumbai Airport when the sabotage was found. Bharat Borge, the man who found the pebbles in the helicopter, was found dead on April 28, 2009 on Mumbai's suburban railway tracks between Vile Parle and Andheri. A letter was also found with him. The Railway Police believe that he might have been run over by a Churchgate-bound fast local. Borge's mysterious death created a flutter, lending credence to Anil Ambani's charge that certain "rival business groups were trying to eliminate him". Airworks India Engineering Pvt. Ltd., the company that maintains the helicopter, filed a case against one of its own employees for putting mud and pebbles into the gearbox.
175. Blu-ray Disc Blu-ray Disc (also known as Blu-ray or BD) is an optical disc storage medium designed by Sony to supersede the standard DVD format. Its main uses are high-definition video and data storage, with up to 50 GB per disc. The disc has the same physical dimensions as standard DVDs and CDs. The name Blu-ray Disc derives from the blue laser used to read the disc. While a standard DVD uses a 650 nanometre red laser, Blu-ray uses a shorter wavelength, a 405 nm blue laser, which allows for almost six times more data storage than on a DVD.
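The capacity jump follows from diffraction: the focused laser spot diameter scales as wavelength divided by the lens's numerical aperture (NA), so areal density scales roughly as (NA/λ)². A quick sketch of that scaling, using the commonly cited NA values of 0.60 for DVD and 0.85 for Blu-ray:

```python
# Rough areal-density comparison between DVD and Blu-ray optics.
# Density scales approximately as (NA / wavelength)^2, because the
# focused laser spot diameter scales as wavelength / NA.

def density_gain(lam_old_nm, na_old, lam_new_nm, na_new):
    """Approximate areal-density ratio of the new optics over the old."""
    return ((na_new / lam_new_nm) / (na_old / lam_old_nm)) ** 2

# DVD: 650 nm red laser, NA = 0.60; Blu-ray: 405 nm blue laser, NA = 0.85.
gain = density_gain(650, 0.60, 405, 0.85)
print(f"Approximate density gain: {gain:.1f}x")   # ~5.2x

# Compare against the nominal single-layer capacities (4.7 GB vs 25 GB):
print(f"Capacity ratio: {25 / 4.7:.1f}x")
```

The remaining headroom up to the "almost six times" figure comes from more efficient modulation and error-correction coding, not from the optics alone.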
During the format war over high-definition optical discs, Blu-ray competed with the HD DVD format. Toshiba, the main company supporting HD DVD, conceded in February 2008, and the format war ended. Blu-ray Disc was developed by the Blu-ray Disc Association, a group representing makers of consumer electronics, computer hardware, and motion pictures. As of January 2009, more than 890 Blu-ray Disc titles were available in Australia, 720 in Japan, 1,140 in the United Kingdom, and 1,500 in the United States. Commercial HDTV sets began to appear in the consumer market around 1998, but there was no commonly accepted, inexpensive way to record or play HD content. In fact, there was no medium with the storage required to accommodate HD codecs, except JVC's Digital VHS and Sony's HDCAM. Nevertheless, it was well known that using lasers with shorter wavelengths would enable optical storage with higher density. When Shuji Nakamura invented practical blue laser diodes, it was a sensation, although a lengthy patent lawsuit delayed commercial introduction. Sony started two projects applying the new diodes: UDO (Ultra Density Optical) and DVR Blue (together with Pioneer), a format of rewritable discs which would eventually become Blu-ray Disc (more specifically, BD-RE); the core technologies of the two formats are essentially similar. The Blu-ray Disc physical specifications were finished in 2004. In January 2005, TDK announced that they had developed a hard-coating polymer for Blu-ray Discs; the protective cartridges, no longer necessary, were scrapped. The BD-ROM specifications were finalized in early 2006. AACS LA, a consortium founded in 2004, had been developing the DRM platform that could be used to securely distribute movies to consumers. However, the final AACS standard was delayed, and then delayed again when an important member of the Blu-ray Disc group voiced concerns. At the request of the initial hardware manufacturers, including Toshiba, Pioneer and Samsung, an interim standard was published which did not include some features, like managed copy.
176. Bermuda Triangle The Bermuda Triangle, also known as the Devil's Triangle, is a region in the western part of the North Atlantic Ocean in which a number of aircraft and surface vessels are alleged to have disappeared in mysterious circumstances which fall beyond the boundaries of human error, piracy, equipment failure, or natural disasters. Popular culture has attributed some of these disappearances to the paranormal, a suspension of the laws of physics, or activity by extraterrestrial beings.
While a substantial body of documentation exists showing numerous incidents to have been inaccurately reported or embellished by later authors, and numerous official agencies have gone on record as stating that the number and nature of disappearances is similar to any other area of ocean, many incidents remain unexplained despite considerable investigation.
The boundaries of the triangle vary by author: they generally cover the Straits of Florida, the Bahamas and the entire Caribbean island area, and the Atlantic east to the Azores; others add the Gulf of Mexico. The more familiar triangular boundary in most written works has as its points somewhere on the Atlantic coast of Florida; San Juan, Puerto Rico; and the mid-Atlantic island of Bermuda, with most of the accidents concentrated along the southern boundary around the Bahamas and the Florida Straits.
The area is one of the most heavily-sailed shipping lanes in the world, with ships crossing through it daily for ports in the Americas, Europe, and the Caribbean Islands. Cruise ships are also plentiful, and pleasure craft regularly go back and forth between Florida and the islands. It is also a heavily flown route for commercial and private aircraft heading towards Florida, the Caribbean, and South America from points north.
177. Piranha A piranha or piraña is a member of a family of omnivorous freshwater fish which live in South American rivers. In Venezuelan rivers, they are called caribes. They are known for their sharp teeth and a voracious appetite for meat. Piranhas belong to the subfamily Serrasalminae, which also includes closely related herbivorous fish such as pacus. Traditionally, only the four genera Pristobrycon, Pygocentrus, Pygopristis and Serrasalmus are considered to be true piranhas, due to their specialized teeth. However, a recent analysis showed that, if the piranha group is to be monophyletic, it should be restricted to Serrasalmus, Pygocentrus and part of Pristobrycon, or expanded to include these taxa plus Pygopristis, Catoprion, and Pristobrycon striolatus. Pygopristis was found to be more closely related to Catoprion than to the other three piranha genera. The total number of piranha species is unknown and new species continue to be described. In 1988, it was stated that fewer than half of the approximately 60 nominal species of piranhas at the time were valid. More recently (in 2003), one author recognized a total of 38 or 39 species, although the validity of some taxa remains questionable. Piranhas are found only in the Amazon basin, in the Orinoco, in rivers of the Guyanas, in the Paraguay-Paraná, and in the São Francisco River systems; some species of piranha have broad geographic ranges, occurring in more than one of the major basins mentioned above, whereas others appear to have more limited distributions.
However, piranha (inevitably former aquarium-dwellers) have been introduced into parts of the United States, even being occasionally found in the Potomac River, although they typically do not survive the cold winters of that region. Piranha have also been discovered in the Kaptai Lake in south-east Bangladesh. Research is being carried out to establish how piranha have moved to such distant corners of the world from their original habitat. It is suspected that rogue exotic fish traders released them in the lake to avoid being caught by anti-poaching forces.
Locals often use piranha teeth to make tools and weapons. Piranha are also a popular food, although if an individual piranha is caught on a hook or line, it may be attacked by other (free) piranhas. Piranha are commonly consumed by subsistence fishermen and often sold for food in local markets. In recent decades, dried specimens have been marketed as tourist souvenirs. Piranhas occasionally bite and sometimes injure bathers and swimmers. A piranha bite is sometimes considered more an act of carelessness than of misfortune, but piranhas are a considerable nuisance to commercial and sport fishers because they steal bait, mutilate catch, damage nets and other gear, and may bite when handled.
178. Solar Energy Solar energy is the radiant light and heat from the Sun that has been harnessed by humans since ancient times using a range of ever-evolving technologies. Solar radiation along with secondary solar resources such as wind and wave power, hydroelectricity and biomass account for most of the available renewable energy on Earth. Only a minuscule fraction of the available solar energy is used.
Solar power provides electrical generation by means of heat engines or photovoltaics. Once converted, its uses are limited only by human ingenuity. A partial list of solar applications includes space heating and cooling through solar architecture, potable water via distillation and disinfection, daylighting, hot water, thermal energy for cooking, and high-temperature process heat for industrial purposes.
Solar technologies are broadly characterized as either passive solar or active solar, depending on the way they capture, convert and distribute sunlight. Active solar techniques include the use of photovoltaic panels and solar thermal collectors, together with electrical or mechanical equipment, to convert sunlight into useful outputs. Passive solar techniques include orienting a building to the Sun, selecting materials with favorable thermal mass or light-dispersing properties, and designing spaces that naturally circulate air. Solar energy refers primarily to the use of solar radiation for practical ends. However, all renewable energies, other than geothermal and tidal, derive their energy from the sun.
Active solar technologies increase the supply of energy and are considered supply-side technologies, while passive solar technologies reduce the need for alternate resources and are generally considered demand-side technologies.
179. Wind Energy Wind power is the conversion of wind energy into a useful form, such as electricity, using wind turbines. At the end of 2008, worldwide nameplate capacity of wind-powered generators was 121.2 gigawatts. Wind power produces about 1.5% of worldwide electricity use, and is growing rapidly, having doubled in the three years between 2005 and 2008. Several countries have achieved relatively high levels of wind power penetration, such as 19% of stationary electricity production in Denmark, 11% in Spain and Portugal, and 7% in Germany and the Republic of Ireland in 2008. As of May 2009, eighty countries around the world are using wind power on a commercial basis.
Large-scale wind farms are typically connected to the local electric power transmission network; smaller turbines are used to provide electricity to isolated locations. Utility companies increasingly buy back surplus electricity produced by small domestic turbines. Wind and solar energy are favoured by environmentalists as alternatives to fossil fuels, because they are plentiful, renewable, widely distributed, clean, and produce no greenhouse gas emissions; however, the construction of wind farms is not universally welcomed, due to their visual impact and other effects on the environment.
Wind power, along with solar power, is non-dispatchable, meaning that for economic operation all of the available output must be taken when it is available, and other resources, such as hydropower, must be used to match supply with demand. The intermittency of wind seldom creates problems when using wind power to supply a low proportion of total demand. Where wind is to be used for a moderate fraction of demand, additional costs for compensation of intermittency are considered to be modest.
The Earth is unevenly heated by the sun, with the result that the poles receive less energy from the sun than the equator does. Also, dry land heats up (and cools down) more quickly than the seas do. The differential heating drives a global atmospheric convection system reaching from the Earth's surface to the stratosphere, which acts as a virtual ceiling. Most of the energy stored in these wind movements can be found at high altitudes, where continuous wind speeds of over 160 km/h (100 mph) occur. Eventually, the wind energy is converted through friction into diffuse heat throughout the Earth's surface and the atmosphere. 180. Wind Electricity Generation Electricity generated by a wind farm is normally fed into the national electric power transmission network. Individual turbines are interconnected with a medium-voltage (usually 34.5 kV) power collection system and communications network. At a substation, this medium-voltage electric current is increased in voltage with a transformer for connection to the high-voltage transmission system. The surplus power produced by domestic microgenerators can, in some jurisdictions, be fed back into the network and sold to the utility company, producing a retail credit for the consumer to offset their energy costs.
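The reason for stepping the 34.5 kV collection voltage up at the substation can be made concrete with the standard three-phase power relation P = √3 · V · I · cos φ: for the same power, the line current (and hence the I²R loss in the conductors) falls as the voltage rises. A minimal sketch with illustrative numbers (the 100 MW farm size, 0.95 power factor, and 230 kV transmission voltage are assumptions, not figures from the text):

```python
import math

# Three-phase line current: P = sqrt(3) * V_line * I * pf
def line_current(p_watts, v_line, pf=0.95):
    """Current per phase (A) for a given three-phase power transfer."""
    return p_watts / (math.sqrt(3) * v_line * pf)

P = 100e6                                  # hypothetical 100 MW wind farm
i_collect = line_current(P, 34_500)        # medium-voltage collection system
i_transmit = line_current(P, 230_000)      # stepped up for transmission
print(f"At 34.5 kV: {i_collect:,.0f} A")
print(f"At 230 kV:  {i_transmit:,.0f} A")

# Resistive loss scales as I^2, so the step-up cuts loss per unit line
# resistance by (230 / 34.5)^2, roughly 44x.
print(f"Loss reduction factor: {(i_collect / i_transmit) ** 2:.0f}x")
```

The same arithmetic explains why the collection system itself runs at tens of kilovolts rather than at the few hundred volts a turbine generator typically produces.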
Induction generators, often used for wind power projects, require reactive power for excitation, so substations used in wind-power collection systems include substantial capacitor banks for power factor correction. Different types of wind turbine generators behave differently during transmission grid disturbances, so extensive modelling of the dynamic electromechanical characteristics of a new wind farm is required by transmission system operators to ensure predictable stable behaviour during system faults (see: Low voltage ride through). In particular, induction generators cannot support the system voltage during faults, unlike steam or hydro turbine-driven synchronous generators (however, properly matched power factor correction capacitors along with the electronic control of resonance can support induction generation without a grid). Doubly-fed machines—wind turbines with solid-state converters between the turbine generator and the collector system—generally have more desirable properties for grid interconnection. Transmission system operators will supply a wind farm developer with a grid code to specify the requirements for interconnection to the transmission grid. This will include power factor, constancy of frequency and dynamic behaviour of the wind farm turbines during a system fault.
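The capacitor banks mentioned above are sized from the reactive power the induction generators draw: for real power P at power factor cos φ, the reactive demand is Q = P · tan φ, and the bank must supply the difference between the uncorrected and target power factors. A hedged sketch with illustrative numbers (the 2 MW rating, 0.85 uncorrected power factor, and 0.98 target are hypothetical, chosen only to show the calculation):

```python
import math

# Power-factor-correction sizing: Q = P * tan(phi), cos(phi) = power factor.
# The capacitor bank supplies the difference in reactive power between the
# actual and the target power factor.
def correction_kvar(p_kw, pf_actual, pf_target):
    """Reactive power (kvar) a capacitor bank must supply."""
    q_actual = p_kw * math.tan(math.acos(pf_actual))
    q_target = p_kw * math.tan(math.acos(pf_target))
    return q_actual - q_target

# Illustrative: a 2 MW induction generator at 0.85 PF, corrected to 0.98.
bank = correction_kvar(2000, 0.85, 0.98)
print(f"Required capacitor bank: {bank:.0f} kvar")
```

In a real collection substation the banks are switched in stages, since the reactive demand varies with the generators' loading.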
Electricity generated from wind power can be highly variable at several different timescales: from hour to hour, daily, and seasonally. Annual variation also exists, but is not as significant. Because instantaneous electrical generation and consumption must remain in balance to maintain grid stability, this variability can present substantial challenges to incorporating large amounts of wind power into a grid system. A series of detailed modelling studies, which looked at the Europe-wide adoption of renewable energy and the interlinking of power grids using HVDC cables, indicates that the entire power usage could come from renewables, with 70% of total energy from wind, at the same sort of costs or lower than at present. Intermittency would be dealt with, according to this model, by a combination of geographic dispersion to de-link weather-system effects and the ability of HVDC to shift power from windy areas to non-windy areas. 181. Sachin Tendulkar Sachin Ramesh Tendulkar is an Indian cricketer widely regarded as one of the greatest batsmen in the history of cricket. In 2002, Wisden ranked him the second greatest Test batsman of all time, after Donald Bradman, and the second greatest one-day international (ODI) batsman of all time, after Viv Richards. In September 2007, the Australian leg spinner Shane Warne rated Tendulkar as the greatest player he has played with or against. Tendulkar was the only player of the current generation to be included in Bradman's Eleven, the dream team of Donald Bradman published in his biography. He is sometimes referred to as the Little Master or Master Blaster.
Tendulkar is the highest run scorer in both Test matches and ODIs, and also the batsman with the most centuries in either form of the game. The first player to score fifty centuries in all international cricket combined, he now has more than eighty international centuries. On October 17, 2008, when he surpassed Brian Lara's record for the most runs scored in Test cricket, he also became the first batsman to score 12,000 runs in that form of the game, having also been the third batsman and first Indian to pass 11,000 runs in Test cricket. He was also the first player to score 10,000 runs in one-day internationals, and the first player to cross every subsequent 1000-run mark that has been crossed in ODI cricket history. In the fourth Test of the Border-Gavaskar Trophy against Australia, Tendulkar surpassed Australia's Allan Border to become the player to cross the 50-run mark the most times in Test cricket history, and also only the second player ever to score 10 Test centuries against Australia, after Sir Jack Hobbs of England more than 70 years earlier. Tendulkar has been honored with the Padma Vibhushan award, India's second highest civilian award, and the Rajiv Gandhi Khel Ratna award, India's highest sporting honor. Tendulkar's performance through the years 1994–1999 coincided with his physical peak, in his early twenties. On the day of the Hindu festival Holi, Tendulkar was asked to open the batting at Auckland against New Zealand in 1994. He went on to make 82 runs off 49 balls. He scored his first ODI century on September 9, 1994 against Australia in Sri Lanka at Colombo.
In 1996 against Pakistan in Sharjah, Indian captain Mohammed Azharuddin was going through a lean patch. Tendulkar and Navjot Singh Sidhu both made centuries to set a record partnership for the second wicket. After getting out, Tendulkar found Azharuddin in two minds about whether to bat. Tendulkar encouraged him to go in, and Azharuddin subsequently unleashed 29 runs off a mere 10 balls, enabling India to post a score in excess of 300 runs for the first time. India went on to win that match.

182. Sachin Tendulkar's Career On December 11, 1988, aged just 15 years and 232 days, Tendulkar scored 100 not out in his debut first-class match for Mumbai against Gujarat, making him the youngest cricketer to score a century on his first-class debut. His first double century was for Mumbai against the visiting Australian team at the Brabourne Stadium in 1998. Tendulkar is the only player to score a century on debut in each of the Ranji Trophy, Duleep Trophy and Irani Trophy.
In 1992, at the age of 19, Tendulkar became the first overseas-born player to represent Yorkshire (Craig White, although born in Yorkshire, was the first player signed by Yorkshire as an overseas player; he had to be listed as such because he had already played for Victoria in Australia). Tendulkar played 16 first-class matches for the county and scored 1070 runs at an average of 46.52. He played his first Test match against Pakistan in Karachi in 1989 under the leadership of Kris Srikkanth. According to Cricinfo's Andrew Miller and Martin Williamson, India took an unconventional approach to combating the Pakistani pace attack by calling up a "baby-faced 16-year-old with one season of first-class cricket to his name". He made just 15 runs, being bowled by Waqar Younis, who also made his debut in that match, but was impressive in how he handled numerous blows to his body from the Pakistani pace attack. Tendulkar followed it up with his maiden Test fifty a few days later at Faisalabad. His One Day International (ODI) debut on December 18 was disappointing: he was dismissed without scoring a run, again by Waqar Younis. The series was followed by a tour of New Zealand in which he fell for 88 in the second Test. His maiden Test century came on the next tour, to England in August 1990, at Old Trafford. Tendulkar further enhanced his development into a world-class batsman during the 1991–1992 tour of Australia, which included an unbeaten 148 in Sydney (the first of many battles against Shane Warne, who made his debut in that match) and a century on the fast and bouncy track at Perth. Merv Hughes famously commented to Allan Border at the time that "This little prick's going to get more runs than you, AB."
A chronic back problem flared up when Pakistan toured India in 1999, with India losing the historic Test at Chepauk despite a gritty century from Tendulkar himself. Worse was to come: Professor Ramesh Tendulkar, Tendulkar's father, died in the middle of the 1999 Cricket World Cup. Tendulkar flew back to India to attend the final rites for his father, missing the match against Zimbabwe. However, he returned to the World Cup with a bang, scoring a century (an unbeaten 140 off 101 balls) in his very next match, against Kenya at Bristol. He dedicated this century to his father.
183. E-Bike Electric bikes are a new and promising alternative form of urban transportation. They provide all the advantages of a regular bicycle: fun exercise, free parking, zero emissions, and freedom from gridlock, while eliminating one of the bicycle's more serious drawbacks, lack of power. Imagine pedalling up a hill as comfortably as riding down; that's what the e-bike experience is all about. In most city situations, riding an electric bike will be faster and cheaper than either a car or public transit.
Fundamentally, an e-bike is just a regular bicycle with an electric motor to provide additional assistance. You can pedal normally and use the motor only to help out on hills and headwinds, or use it all the time just to make riding easier. The experience is entirely different from riding, say, a gas scooter or motorbike: here the electric assistance is perfectly smooth and silent, and it complements rather than supplants human power.
The short answer is yes, but not by much. The effect of weight on how a bicycle performs is greatly exaggerated. People spend thousands to shave a few pounds off a really high-end bike, but since the rider is already at least 5–6 times heavier than the bike, the vehicle weight itself makes minimal difference. A heavier bicycle is slightly harder to ride uphill, somewhat faster downhill, and much the same on the flat as a lightweight one.
The addition of a motor and batteries can add anywhere from 20 to 40 lb to a bike, yet has surprisingly little effect on its rideability. My university textbooks weigh a comparable amount and aren't nearly so helpful on the road. You definitely do notice the weight if you have to pick the bike up and carry it for any reason, though, and it can be a bit unwieldy. Those 20–40 pounds of additional weight are more than made up for by the motor's hauling capacity on even the steepest of hills, and trip times with an e-bike are usually 20–30% faster than on a regular bicycle.
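The weight argument above can be made concrete with some rough arithmetic. The rider and bike weights below are illustrative assumptions; only the 20–40 lb range comes from the text:

```python
# Rough sketch: how much does the motor-and-battery weight change the total
# moving mass of bike plus rider? All figures are illustrative assumptions.

rider_lb = 170          # assumed rider weight
bike_lb = 30            # assumed weight of a regular bicycle
ebike_extra_lb = 30     # midpoint of the 20-40 lb range quoted above

base_total = rider_lb + bike_lb
ebike_total = base_total + ebike_extra_lb

increase = (ebike_total - base_total) / base_total
print(f"Total mass rises from {base_total} lb to {ebike_total} lb "
      f"({increase:.0%} more)")
```

With these assumed numbers the total moving mass grows by only about 15%, which is why the extra weight is barely felt on the road even though it is obvious when carrying the bike.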
185. Induction Cooker An induction cooker uses a type of induction heating for cooking. A coil of copper wire is placed underneath the cooking pot. An oscillating current is applied to this coil, which produces an oscillating magnetic field. This magnetic field creates heat in two different ways. It induces a current in an electrically conductive pot, which produces Joule (I²R) heat. It also creates magnetic hysteresis losses in a ferromagnetic pot. The former effect dominates; hysteresis losses typically account for less than ten percent of the total heat generated.
It would be possible to build an induction cooker that worked with any conductive pot (for example, an aluminum or copper pot), whether or not the pot was ferromagnetic. But the increased permeability of an iron or steel pot makes the system more practical, by increasing the inductance seen at the drive coil and by decreasing the skin depth of the current in the pot, which increases the AC resistance for the I²R heating. Most practical induction cookers are designed for ferromagnetic pots; consumers are generally advised that the cooker will work only with pots that will stick to a magnet. It would not be possible to build an induction cooker that worked with an electrically insulating (for example, glass or ceramic) pot under any conditions.
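The skin-depth effect mentioned above can be sketched with the classical skin-depth formula δ = √(2ρ/ωμ). The drive frequency and material constants below are textbook-order assumptions (the relative permeability of steel in particular varies widely in practice), not values from the text:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def skin_depth(resistivity, mu_r, freq_hz):
    """Classical skin depth: delta = sqrt(2*rho / (omega * mu))."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity / (omega * MU0 * mu_r))

f = 24e3  # a plausible induction-cooker drive frequency (assumption)

# Textbook-order material constants (assumptions, see lead-in).
steel = skin_depth(resistivity=1.4e-7, mu_r=100, freq_hz=f)
alu = skin_depth(resistivity=2.65e-8, mu_r=1, freq_hz=f)

print(f"steel: {steel*1e3:.2f} mm, aluminium: {alu*1e3:.2f} mm")
```

Under these assumptions the skin depth in steel comes out several times smaller than in aluminium, confining the induced current to a thin layer and raising the effective AC resistance for I²R heating, which is the practical advantage of a ferromagnetic pot.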
Induction cookers are faster and more energy-efficient than traditional cooktops. Unlike a traditional cooktop, an induction cooker heats the pot itself to the desired temperature rather than heating the stovetop, reducing the possibility of injury. Skin can still be burned by contact with the pot, or with the stovetop after a pot is removed, but the maximum temperature in the system is that of the pot, which is much less capable of causing serious injury than the high temperatures of flames or red-hot electric heating elements. The induction cooker also does not warm the air around it, adding to its energy efficiency.
Since heat is generated by an induced electric current, the range can detect when cookware is removed, or when its contents boil dry, by monitoring the voltage drop caused by the resistance in the circuit. This allows additional functions, such as keeping a pot at a minimal boil or automatically turning off when the cookware is removed. This form of flameless cooking has some advantages over conventional gas-flame and electric cookers, as it provides rapid heating, improved thermal efficiency, greater heat consistency, plus the same or a greater degree of controllability than gas. In situations in which a hotplate would typically be dangerous or illegal, an induction plate is ideal, as it creates no heat itself.
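The pot-detection idea above can be sketched as a toy decision function. The threshold and the single-number "voltage drop" measurement are purely illustrative assumptions, not real cooker firmware:

```python
# Hypothetical sketch of the pot-detection logic described above.
# A coupled ferromagnetic pot loads the drive circuit, producing a
# measurable voltage drop; a drop below some threshold suggests the
# pot was removed (or boiled dry). Values are illustrative only.

def classify(coil_voltage_drop, loaded_threshold=0.5):
    """Return the cooker's response to a measured coil voltage drop."""
    if coil_voltage_drop >= loaded_threshold:
        return "heating"     # pot present: keep delivering power
    return "shut off"        # pot absent or boiled dry: stop the element

print(classify(0.8))
print(classify(0.1))
```

A real cooker would filter the measurement over time and distinguish "pot removed" from "contents boiled dry", but the basic decision, keep heating or shut off based on how heavily the coil is loaded, is the same.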
186. Electric Rice Cooker The preparation of rice has traditionally been a cooking process which requires attention to ensure the rice is cooked properly. Rice cookers simplify the process by automatically controlling the heat and timing, while at the same time freeing up a heating element on the range. Although a rice cooker does not necessarily speed up the cooking process, the cook's involvement is reduced to simply using the correct amount of water; once the rice cooker is set to cook, the rice will be cooked with no further attention.
Typically, a rice cooker consists of an insulated outer container holding a heating element, into which is fitted a removable inner bowl, often non-stick or Teflon-coated, with graduations marked in cups of (white) rice. Whereas less expensive and older models use simple electronics with mechanical and thermal sensors, high-end rice cookers use microprocessors to control the cooking process and often incorporate a timer for setting the desired "ready time". Some higher-end rice cookers use induction heating. Many rice cookers can keep rice warm safely for up to 24 hours, which helps avoid the dangers of food poisoning due to Bacillus cereus. New rice cookers normally include a small measuring cup and a plastic paddle for serving the cooked rice. The rice cup measure is normally 180 ml, approximately 25% smaller than the American measuring cup of 8 (US) fluid ounces / 250 ml.
Restaurants that serve a lot of rice, particularly those specializing in Asian cuisine, often use industrial size rice cookers that quickly and cheaply produce large quantities of cooked rice. A rice cooker is a standard appliance in kitchens in many Asian countries and in many Asian households; indeed a recent survey showed that over 95% of Japanese kitchens feature such a device.
The bowl in the rice cooker is usually removable, and beneath it lie a heater and a thermostat, which form the main components of the rice cooker. A spring pushes the thermostat against the bottom of the bowl for good thermal contact, to ensure accurate temperature measurement. During cooking the rice/water mixture is heated at full power. Its temperature cannot rise above the boiling point of water, 100°C (212°F), since any heat put into the mixture at that point only causes the water to boil. By the end of cooking, some of the water has been absorbed by the rice and the rest has boiled off. Once heating continues past that point, the temperature exceeds the boiling point; the thermostat then trips, switching the rice cooker to a low-power "warming" mode that keeps the rice no cooler than approximately 65°C (150°F). Simple rice cookers may just turn off at that point.

187. Lakshmi Mittal Mittal was born into a Rajasthani Agrawal family and spent his early years in India, living with his extended family on bare floors and rope beds in a house built by his grandfather. His grandfather worked for Tarachand Ghanshyamdas Poddar, one of the leading industrial firms of India. His father started an oil mill in Hardoi, Uttar Pradesh as a working partner of Lala Gulab Chand, a local business tycoon. The family eventually moved to Calcutta, where his father, Mohan, became a partner in a steel company and made a fortune. Mittal graduated from St. Xavier's College in Calcutta with a Bachelor of Commerce degree in Business and Accounting in 1969. He began his career in the family's steelmaking business in India, and in 1976, when the family founded its own steel business, he set out to establish its international division, beginning with the purchase of a run-down plant in Indonesia. Shortly afterwards he married Usha, the daughter of a well-to-do moneylender.
In 1994, due to differences with his father, mother and brothers, he branched out on his own, taking over the international operations of the Mittal steel business, which was already owned by the family. Mittal's family never spoke publicly about the reasons for the split.
Controversy erupted in 2002 when Plaid Cymru MP Adam Price exposed the link between UK prime minister Tony Blair and Mittal in the Mittal Affair, also known as 'Garbagegate' or 'Cash for Influence'. Mittal's LNM steel company, registered in the Netherlands Antilles and maintaining less than 1% of its 100,000-plus workforce in the UK, sought Blair's aid in its bid to purchase Romania's state steel industry. The letter from Blair to the Romanian government, a copy of which Price was able to obtain, hinted that the privatisation of the firm and its sale to Mittal might help smooth the way for Romania's entry into the European Union.
In exchange for Blair's support, Mittal, already a Labour contributor, donated a further £125,000 to Labour party funds a week after the 2001 UK general election, while, as Price and others pointed out, as many as six thousand Welsh steelworkers were laid off that same year. Mittal is a non-resident Indian who has lived in the United Kingdom for over 14 years. Because of this, he has been included on many unofficial wealth-indicative lists as the richest man in the United Kingdom, when in actuality the list held by the UK and Channel Island Treasury Authority makes no mention of the name "Lakshmi (or derivatives) Mittal". Corus Group and Valkia Limited were two of the primary employers in south Wales, particularly in Ebbw Vale, Llanwern, and Port Talbot.
188. Six Pack (Rectus Abdominis Muscle) The rectus abdominis muscle is a paired muscle running vertically on each side of the anterior wall of the human abdomen (and in some other animals). The two parallel muscles are separated by a midline band of connective tissue called the linea alba (white line). The muscle extends from the pubic symphysis and pubic crest inferiorly to the xiphisternum/xiphoid process and lower costal cartilages (5–7) superiorly. It is contained in the rectus sheath.
The rectus is usually crossed by three fibrous bands called the tendinous inscriptions. Colloquial names for the appearance of a well-defined rectus abdominis include "six pack" and "washboard abs", and these often carry cultural connotations of superior physical fitness. While the "six pack" is by far the most common configuration of the muscle bellies of the rectus, there exist rare anatomic variations which result in the appearance of eight muscle segments (four per side, an "eight pack"), ten, or (even rarer) asymmetrically arranged segments. All these variations are functionally equivalent.
The upper portion, attached principally to the cartilage of the fifth rib, usually has some fibers of insertion into the anterior extremity of the rib itself. Some fibers are occasionally connected with the costoxiphoid ligaments and the side of the xiphoid process. The rectus abdominis has several sources of arterial blood supply; in reconstructive-surgery terms, it is a Mathes and Nahai Type III muscle with two dominant pedicles. First, the inferior epigastric artery and vein (or veins) run superiorly on the posterior surface of the rectus abdominis, enter the rectus fascia at the arcuate line, and serve the lower part of the muscle. Second, the superior epigastric artery, a terminal branch of the internal thoracic artery, supplies blood to the upper portion. Finally, numerous small segmental contributions come from the lower six intercostal arteries as well.
The internal oblique performs two major functions. First, it acts as an antagonist (opponent) to the diaphragm, helping to reduce the volume of the thoracic (chest) cavity during exhalation. When the diaphragm contracts, it pulls the lower wall of the chest cavity down, increasing the volume of the lungs, which then fill with air. Conversely, when the internal obliques contract they compress the organs of the abdomen, pushing them up into the diaphragm, which intrudes back into the chest cavity, reducing the volume of the air-filled lungs and producing an exhalation.

189. Leander Paes Leander Adrian Paes (born June 17, 1973) is an Indian professional tennis player who currently features in the doubles events on the ATP tour and in the Davis Cup. He is one of the most successful professional Indian tennis players and a former captain of the Indian tennis team. He has won various doubles and mixed doubles events at the tennis Grand Slams. He is also the recipient of India's highest sporting honour, the Rajiv Gandhi Khel Ratna award, in 1996–1997, and of the Padma Shri award in 2001 for his contribution to tennis in India. Leander was born in Kolkata (formerly known as Calcutta), India, to Vece and Jennifer Paes, and was raised there. His family hails from the Goan Catholic community, a Christian community in Calcutta. He was educated at La Martiniere School and St. Xavier's College, Calcutta. His parents were both sportspersons: his father Vece Paes was a midfielder in the bronze-medal-winning Indian field hockey team at the 1972 Munich Olympics, and his mother captained the Indian basketball team at the 1980 Asian basketball championship. Paes enrolled at the Britannia Amritraj Tennis Academy in Madras in 1985, where he was coached by Dave O'Meara. The academy played a key role in his early development.
Paes showed promise early in his career by winning titles at the junior US Open and junior Wimbledon. He turned professional in 1991, rising to number 1 in the world junior rankings. In 1992, he reached the quarterfinals of the doubles event at the Barcelona Olympics with Ramesh Krishnan. He went one better at the 1996 Atlanta Olympics, where he beat Fernando Meligeni to win the bronze medal, becoming the first Indian to win an individual Olympic medal in more than four decades, since KD Jadhav's bronze at the 1952 Helsinki Olympics. Paes cited the match as one of his greatest performances on the court, in part because his wrist was severely injured. He was awarded the Government of India's highest sporting honour, the Rajiv Gandhi Khel Ratna, in 1996. His first successful year on the ATP circuit came in 1993, when he partnered Sébastien Lareau to reach the US Open doubles semifinal. After a moderate season in 1994 he reached the quarterfinal of the 1995 Australian Open doubles with Kevin Ullyett. From 1996 he started partnering fellow Indian Mahesh Bhupathi, which would later prove to be a winning combination.

190. Mahesh Bhupathi Mahesh Shrinivas Bhupathi (born 7 June 1974 in Chennai, India) is a professional tennis player. He is married to Shvetha Jaishankar Bhupathi, an Indian model whom he met at a party. He turned professional in 1995, and in 2001 he was awarded the Padma Shri. He is among the best doubles tennis players in the world, with 11 Grand Slam titles to his credit, including mixed doubles. In 1997, he became the first Indian to win a Grand Slam tournament (with Rika Hiraki in mixed doubles). Bhupathi is an alumnus of the University of Mississippi (Ole Miss) at Oxford, MS, U.S.A. In 1999, Bhupathi won three doubles titles with Leander Paes, including Roland Garros and Wimbledon.
He and Leander became the first doubles team to reach the finals of all four Grand Slams in a year, a feat last achieved in 1952 and a first for the Open era. On April 26 that year, they became the world No. 1 doubles team. Bhupathi also won the US Open mixed doubles with Ai Sugiyama of Japan.
In 2006, Bhupathi teamed with Martina Hingis in the Australian Open mixed doubles competition. Entering the tournament unseeded and as wildcards, the first-time pairing defeated four seeded opponents along the way, dropping only a single set throughout. Bhupathi and Hingis defeated the sixth-seeded team of Daniel Nestor and Elena Likhovtseva in straight sets, 6–3 6–3, to capture the championship. It was the sixth mixed doubles Grand Slam title for Bhupathi, and a first for Hingis. By winning the Australian Open, Bhupathi completed a career Grand Slam in mixed doubles.
In 2007, Bhupathi and the Czech Radek Štěpánek reached the quarterfinals of the Australian Open men's doubles. He teamed with Štěpánek again at the 2007 French Open to make the doubles semifinals, defeating the two-time defending champions Jonas Björkman and Max Mirnyi in the quarterfinals; the pair then lost to the eventual champions Mark Knowles and Daniel Nestor. After Wimbledon, Bhupathi teamed with Pavel Vízner to win the 2007 Canada Masters, defeating the top-ranked doubles team of Bob and Mike Bryan en route. After this victory, he won a tournament in New Haven with Nenad Zimonjić, and the two paired again at the 2007 US Open. After the US Open, Knowles and Nestor, the team that had beaten Bhupathi and Štěpánek in the French Open semifinals, split; Bhupathi was to become Knowles's new partner while Zimonjić would partner Nestor, but back surgery meant he was expected to be out until the end of the year.
191. Nuclear Energy Nuclear energy is released by the splitting (fission) or merging together (fusion) of the nuclei of atoms. The conversion of nuclear mass to energy is consistent with the mass-energy equivalence formula ΔE = Δm·c², in which ΔE is the energy released, Δm the mass defect, and c the speed of light in a vacuum (a physical constant). Nuclear energy was first discovered by the French physicist Henri Becquerel in 1896, when he found that photographic plates stored in the dark near uranium were blackened like X-ray plates; X-rays themselves had been discovered only the year before, in 1895.
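As a worked example of the mass-energy formula above, consider the energy released by a mass defect of one gram (the one-gram figure is chosen purely for illustration):

```python
# Worked example of the mass-energy relation dE = dm * c^2.

c = 2.998e8      # speed of light in vacuum, m/s
delta_m = 1e-3   # an illustrative mass defect of 1 gram, in kg

delta_E = delta_m * c**2  # energy released, in joules
print(f"A 1 g mass defect corresponds to about {delta_E:.2e} J")
```

The result is on the order of 9 × 10¹³ joules, roughly the energy released by burning thousands of tonnes of coal, which illustrates why nuclear reactions release so much more energy per unit mass than chemical ones.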
Nuclear chemistry can be used as a form of alchemy to turn lead into gold, or to change any atom into any other atom (albeit through many steps). Radionuclide (radioisotope) production often involves irradiation of another isotope (or, more precisely, a nuclide) with alpha particles, beta particles, or gamma rays. Iron has the highest binding energy per nucleon of any atom. If an atom of lower average binding energy is changed into an atom of higher average binding energy, energy is given off. The binding-energy curve shows that fusion of hydrogen, the combination of light nuclei to form heavier atoms, releases energy, as does fission of uranium, the breaking up of a larger nucleus into smaller parts. Stability varies between isotopes: the isotope U-235 is much less stable than the more common U-238.
The nuclear force has been at the heart of nuclear physics ever since the field was born in 1932 with the discovery of the neutron by James Chadwick. The traditional goal of nuclear physics is to understand the properties of atomic nuclei in terms of the 'bare' interaction between pairs of nucleons, or nucleon-nucleon forces.
In 1935, Hideki Yukawa made the earliest attempt to explain the nature of the nuclear force. According to his theory, massive bosons (mesons) mediate the interaction between two nucleons. Although, in light of QCD, meson theory is no longer perceived as fundamental, the meson-exchange concept (where hadrons are treated as elementary particles) continues to represent the best working model for a quantitative NN potential.
Historically, it was a formidable task to describe the nuclear force phenomenologically, and the first semi-empirical quantitative models came in the mid-1950s. There has been substantial progress in experiment and theory related to the nuclear force. Most basic questions were settled in the 1960s and 1970s. In recent years, experimenters have concentrated on the subtleties of the nuclear force, such as its charge dependence, the precise value of the πNN coupling constant, improved phase shift analysis, high-precision NN data, high-precision NN potentials, NN scattering at intermediate and high energies, and attempts to derive the nuclear force from QCD.
192. Michael Schumacher Michael Schumacher (born January 3, 1969, in Hürth-Hermülheim, Germany) is a former Formula One driver and seven-time Formula One world drivers' champion, and a current advisor and occasional test driver for Ferrari. According to the official Formula One website, he is "statistically the greatest driver the sport has ever seen". He is the only German to have won the Formula One World Championship, and is credited with popularising Formula One in Germany. In a 2006 FIA survey, Michael Schumacher was voted the most popular driver of the season among Formula One fans.
After winning two championships with Benetton, Michael Schumacher moved to Scuderia Ferrari in 1996 and won five consecutive drivers' titles with them from 2000–2004. Schumacher holds many records in Formula One, including most drivers' championships, race victories, fastest laps, pole positions, points scored and most races won in a single season. Schumacher is the only Formula One driver to have an entire season of podium finishes, a feat he accomplished in 2002. His driving sometimes created controversy: he was twice involved in collisions that determined the outcome of the world championship, most notably his disqualification from the 1997 championship for causing a collision with Jacques Villeneuve. After the 2006 Formula One season Schumacher retired from race driving.
Off the track, Schumacher is an ambassador for UNESCO and a spokesman for driver safety. He has been involved in numerous humanitarian efforts throughout his life and has donated tens of millions of dollars to charity. He is the elder brother of former F1 driver Ralf Schumacher, currently racing in the Deutsche Tourenwagen Masters (DTM). They are the only brothers in F1 history to have both won races, and they scored the first ever one-two finish by brothers in Formula One.
Schumacher was noted throughout his career for his ability to produce fast laps at crucial moments in a race, to push his car to the very limit for sustained periods. Motor sport author Christopher Hilton observed in 2003 that "A measure of a driver's capabilities is his performance in wet races, because the most delicate car control and sensitivity are needed," and noted that like other great drivers, Schumacher's record in wet conditions shows very few mistakes: up to the end of the 2003 season, Schumacher won 17 of the 30 races in wet conditions he contested. Some of Schumacher's best performances occurred in such conditions, earning him the title "Regenkönig" (rain king) or "Regenmeister" (rain master).
193. Valentino Rossi Valentino Rossi (born February 16, 1979 in Urbino) is an Italian professional motorcycle racer and multiple MotoGP World Champion. He is one of the most successful motorcycle racers of all time, with 8 Grand Prix World Championships to his name. According to Sports Illustrated, Rossi is one of the highest earning sports personalities in the world, having earned an estimated $34 million in 2007.
Following in the footsteps of his father, Graziano Rossi, Rossi started racing in Grand Prix in 1996 for Aprilia in the 125cc category and won his first World Championship the following year. From there, he moved up to the 250cc category, again with Aprilia, and won the World Championship in 1999. He won the 500cc World Championship with Honda in 2001, the MotoGP World Championships (also with Honda) in 2002 and 2003, and continued his streak of back-to-back championships by winning the 2004 and 2005 titles after leaving Honda for Yamaha, before regaining the title in 2008. In 1994, Aprilia, by way of the Sandroni, used Rossi to improve its RS125R, which in turn allowed Rossi to learn how to handle the fast new pace of 125cc racing. He first rode a Sandroni in the 1994 Italian championship and continued to ride it through the 1995 European and Italian championships.
Rossi had variable success in the 1996 World Championship season, failing to finish five of the season's races and crashing several times. Despite this, in August he won his first World Championship Grand Prix, at Brno in the Czech Republic, on an AGV Aprilia RS125R, and finished the season in ninth position. Rossi treated the year as a learning process and refined his skills enough to completely dominate the 125cc World Championship in the following 1997 season, winning 11 of the 15 races. The inaugural year for the MotoGP bikes was 2002, when riders experienced teething problems getting used to the new machines (or had to make do with the inferior 500cc bikes). Rossi won the first race and went on to win eight of the first nine races of the season, eventually claiming 11 victories in total.
It was more of the same in 2003 for Rossi's rivals when he claimed nine pole positions as well as nine GP wins to claim his third consecutive World Championship. The Australian GP at Phillip Island in 2003 is considered to be one of Rossi's greatest career moments due to unique circumstances. After being given a 10-second penalty for overtaking during a yellow flag due to a crash by Ducati rider Troy Bayliss, front runner Rossi proceeded to pull away from the rest of the field, eventually finishing more than 15 seconds ahead, more than enough to cancel out the penalty and win the race.
194. Pacific Ocean The Pacific Ocean is the largest of the Earth's oceanic divisions. Its name is derived from the Latin name Mare Pacificum, "peaceful sea", bestowed upon it by the Portuguese explorer Ferdinand Magellan. It extends from the Arctic in the north to Antarctica in the south, bounded by Asia and Australia in the west, and the Americas in the east. At 169.2 million square kilometres (65.3 million square miles) in area, this largest division of the World Ocean – and, in turn, the hydrosphere – covers about 46% of the Earth's water surface and about 32% of its total surface area, making it larger than all of the Earth's land area combined. The equator subdivides it into the North Pacific Ocean and South Pacific Ocean. The Mariana Trench in the western North Pacific is the deepest point in the Pacific and in the world, reaching a depth of 10,911 metres (35,798 ft).
The ocean encompasses almost a third of the Earth's surface, having an area of 179.7 million square kilometres (69.4 million square miles), significantly larger than Earth's entire landmass, with room for another Africa to spare. Extending approximately 15,500 kilometres (9,600 mi) from the Bering Sea in the Arctic to the icy margins of Antarctica's Ross Sea in the south (although the Antarctic regions of the Pacific are sometimes described as part of the circumpolar Southern Ocean), the Pacific reaches its greatest east-west width at about 5°N latitude, where it stretches approximately 19,800 kilometres (12,300 mi) from Indonesia to the coast of Colombia and Peru: halfway across the world, and more than five times the diameter of the Moon. The western limit of the ocean is often placed at the Strait of Malacca. The lowest point on Earth, the Mariana Trench, lies 10,911 metres (35,797 ft) below sea level, and the ocean's average depth is 4,280 metres (14,000 ft). The Pacific contains about 25,000 islands (more than the total number in the rest of the world's oceans combined), the majority of which are found south of the equator. Including partially submerged islands, the figure is substantially higher.
Along the Pacific Ocean's irregular western margins lie many seas, the largest of which are the Celebes Sea, Coral Sea, East China Sea, Philippine Sea, Sea of Japan, South China Sea, Sulu Sea, Tasman Sea, and Yellow Sea. The Strait of Malacca joins the Pacific and the Indian Oceans on the west, and Drake Passage and the Straits of Magellan link the Pacific with the Atlantic Ocean on the east. To the north, the Bering Strait connects the Pacific with the Arctic Ocean.
195. Volcano A volcano is an opening, or rupture, in a planet's surface or crust which allows hot, molten rock, ash, and gases to escape from below the surface. Volcanic activity involving the extrusion of rock tends to form mountains, or features like mountains, over a period of time. The word derives from Vulcano, a volcanic island whose name in turn comes from Vulcan, the Roman god of fire.
Volcanoes are generally found where tectonic plates are diverging or converging. A mid-oceanic ridge, for example the Mid-Atlantic Ridge, has examples of volcanoes caused by divergent tectonic plates pulling apart; the Pacific Ring of Fire has examples of volcanoes caused by convergent tectonic plates coming together. By contrast, volcanoes are usually not created where two tectonic plates slide past one another. Volcanoes can also form where there is stretching and thinning of the Earth's crust (called "non-hotspot intraplate volcanism"), such as in the African Rift Valley, the Wells Gray-Clearwater volcanic field and the Rio Grande Rift in North America and the European Rhine Graben with its Eifel volcanoes.
Volcanoes can be caused by mantle plumes. These so-called hotspots, for example at Hawaii, can occur far from plate boundaries. Hotspot volcanoes are also found elsewhere in the solar system, especially on rocky planets and moons. Volcanic cones or cinder cones result from eruptions that eject mostly small pieces of scoria and pyroclastics (both resemble cinders, hence the name of this volcano type) that build up around the vent. These can be relatively short-lived eruptions that produce a cone-shaped hill perhaps 30 to 400 meters high. Most cinder cones erupt only once. Cinder cones may form as flank vents on larger volcanoes, or occur on their own. Parícutin in Mexico and Sunset Crater in Arizona are examples of cinder cones. In New Mexico, Caja del Rio is a volcanic field of over 60 cinder cones.
Hotspots are not usually located at the boundaries of tectonic plates, but above mantle plumes, where convection in the Earth's mantle creates a column of hot material that rises until it reaches the crust, which tends to be thinner there than in other areas of the Earth. The temperature of the plume causes the crust to melt and form pipes, which can vent magma. Because the tectonic plates move while the mantle plume remains in the same place, each volcano becomes dormant after a while and a new volcano forms as the plate shifts over the hotspot. The Hawaiian Islands are thought to have formed in this manner, as is the Snake River Plain, with the Yellowstone Caldera being the part of the North American plate currently above the hotspot.
196. Pyramid A pyramid is a building where the outer surfaces are triangular and converge at a point. The base of a pyramid is usually trilateral or quadrilateral (but may be of any polygon shape), meaning that a pyramid usually has four or five faces. A pyramid's design, with the majority of the weight closer to the ground, means that less material higher up on the pyramid will be pushing down from above: this allowed early civilizations to create stable monumental structures.
For thousands of years, the largest structures on Earth were pyramids: first the Red Pyramid in the Dashur Necropolis and then the Great Pyramid of Khufu, the only one of the Seven Wonders of the Ancient World still remaining. The largest pyramid ever built, by volume, is the Great Pyramid of Cholula, in the Mexican state of Puebla. This pyramid is still being excavated.
The most famous pyramids are the Egyptian pyramids: huge structures built of brick or stone, some of which are among the largest constructions ever made. As of 2008, 138 pyramids had been discovered in Egypt. The Great Pyramid of Giza is the largest in Egypt and one of the largest in the world; until the completion of Lincoln Cathedral's spire around 1311 AD, it was the tallest building in the world. Its base covers over 52,600 square meters. Egypt has the most pyramids in the world, with Sudan a close second. The Great Pyramid was one of the Seven Wonders of the World, and the only one of the seven to survive into modern times. The Ancient Egyptians covered the faces of their pyramids with polished white limestone, though most of these casing stones have since fallen or been removed and used to build the mosques of Cairo. Pausanias, a Greek traveler of the second century AD, described several structures as pyramids. One of these was located in Hellenikon (Ελληνικό in Greek), a village near Argos, close to the ancient ruins of Tiryns. The story surrounding the monument was that it was built as a polyandria, a common grave, for soldiers who had fallen in the struggle for the throne of Argos back in the 14th century BC. He described the structure as resembling a pyramid decorated with Argolic shields, showing its military connection. Another pyramid Pausanias saw on his journeys was at Kenchreai, another polyandria, dedicated to the Argives and Spartans who lost their lives at the Battle of Hysiai in 669 BC.
197. Great Wall The Great Wall of China (literally "the long wall of 10,000 li") is a series of stone and earthen fortifications in northern China, built, rebuilt, and maintained between the 5th century BC and the 16th century to protect the northern borders of the Chinese Empire from Xiongnu attacks during various successive dynasties. Since the 5th century BC, several walls have been built that were referred to as the Great Wall. One of the most famous is the wall built between 220 and 206 BC by the first Emperor of China, Qin Shi Huang. Little of that wall remains; it lay farther north than the current wall, which was built during the Ming Dynasty.
The Great Wall stretches over approximately 6,400 km (4,000 miles) from Shanhaiguan in the east to Lop Nur in the west, along an arc that roughly delineates the southern edge of Inner Mongolia, but extends to over 6,700 km (4,160 miles) in total; a more recent archaeological survey using advanced technologies found that the entire Great Wall, with all of its branches, stretches for 8,851.8 km (5,500.3 mi). At its peak, the Ming Wall was guarded by more than one million men. It has been estimated that somewhere in the range of 2 to 3 million Chinese died as part of the centuries-long project of building the wall.
The Chinese were already familiar with the techniques of wall-building by the time of the Spring and Autumn Period, which began around the 8th century BC. During the Warring States Period from the 5th century BC to 221 BC, the states of Qi, Yan and Zhao all constructed extensive fortifications to defend their own borders. Built to withstand the attack of small arms such as swords and spears, these walls were made mostly by stamping earth and gravel between board frames. Qin Shi Huang conquered all opposing states and unified China in 221 BC, establishing the Qin Dynasty. Intending to impose centralized rule and prevent the resurgence of feudal lords, he ordered the destruction of the wall sections that divided his empire along the former state borders. To protect the empire against intrusions by the Xiongnu people from the north, he ordered the building of a new wall to connect the remaining fortifications along the empire's new northern frontier. Transporting the large quantity of materials required for construction was difficult, so builders always tried to use local resources. Stones from the mountains were used over mountain ranges, while rammed earth was used for construction in the plains. There are no surviving historical records indicating the exact length and course of the Qin Dynasty walls. Most of the ancient walls have eroded away over the centuries, and very few sections remain today. Later, the Han, Sui, Northern and Jin dynasties all repaired, rebuilt, or expanded sections of the Great Wall at great cost to defend themselves against northern invaders.
198. Oscar Award The Academy Awards, popularly known as the Oscars, are presented annually by the American Academy of Motion Picture Arts and Sciences (AMPAS) to recognize excellence of professionals in the film industry, including directors, actors, and writers. The formal ceremony at which the awards are presented is one of the most prominent award ceremonies in the world. The Academy of Motion Picture Arts and Sciences itself was conceived by Metro-Goldwyn-Mayer studio boss Louis B. Mayer.
The 1st Academy Awards ceremony was held on Thursday, May 16, 1929, at the Hotel Roosevelt in Hollywood to honor outstanding film achievements of 1927 and 1928. It was hosted by actor Douglas Fairbanks and director William C. deMille. The 81st Academy Awards, honoring the best in film for 2008, was held on Sunday, February 22, 2009, at the Kodak Theatre in Hollywood, with actor Hugh Jackman hosting the ceremony.
The first awards were presented on May 16, 1929, at a private dinner in Hollywood with an audience of fewer than 1,000 people. Since the first year, the awards have been publicly broadcast, at first by radio and then by television after 1953. During the first decade, the results were given to newspapers for publication at 11 p.m. on the night of the awards. This method was abandoned after the Los Angeles Times announced the winners before the ceremony began; as a result, the Academy has since used a sealed envelope to reveal the names of the winners. Since 2002, the awards have been broadcast from the Kodak Theatre. The official name of the Oscar statuette is the Academy Award of Merit. Made of gold-plated britannium on a black metal base, it is 13.5 in (34 cm) tall, weighs 8.5 lb (3.85 kg) and depicts a knight rendered in Art Deco style holding a crusader's sword, standing on a reel of film with five spokes. The five spokes represent the original branches of the Academy: Actors, Writers, Directors, Producers, and Technicians. MGM's art director Cedric Gibbons, one of the original Academy members, supervised the design of the award trophy, sketching the design on a scroll. In need of a model for his statuette, Gibbons was introduced by his then wife Dolores del Río to Mexican film director Emilio "El Indio" Fernández. Reluctant at first, Fernández was finally convinced to pose nude to create what is known today as the "Oscar". Sculptor George Stanley then sculpted Gibbons's design in clay, and Alex Smith cast the statuette in 92.5 percent tin and 7.5 percent copper before gold-plating it. The only addition to the Oscar since it was created is a minor streamlining of the base.
199. Nobel Prize The Nobel Prize is a Swedish prize, established in the 1895 will of Swedish chemist and inventor Alfred Nobel; it was first awarded in Physics, Chemistry, Physiology or Medicine, Literature, and Peace in 1901. An associated prize, The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, was instituted by Sweden's central bank in 1968 and first awarded in 1969. The Nobel Prizes in the specific disciplines (Physics, Chemistry, Physiology or Medicine, and Literature) and the Prize in Economics, which is commonly identified with them, are widely regarded as the most prestigious award one can receive in those fields. The Nobel Peace Prize conveys social prestige and is often politically controversial.
With the exception of the Nobel Peace Prize, the Nobel Prizes and the Prize in Economics are presented in Stockholm, Sweden, at the annual Prize Award Ceremony on the 10th of December, the anniversary of Nobel's death. The recipients' lectures are presented in the days prior to the award ceremony. The Nobel Peace Prize and its recipients' lectures are presented at the annual Prize Award Ceremony in Oslo, Norway, also on the 10th of December. Norway awards a part of the prize because, at the time of Alfred Nobel's death, Norway and Sweden were joined in a personal union known as the Swedish-Norwegian Union. The award ceremonies and the associated banquets are nowadays major international events.

Five Nobel Prizes were instituted by the final will of Alfred Nobel, a Swedish chemist and industrialist who was the inventor of the high explosive dynamite. Though Nobel wrote several wills during his lifetime, the last was written a little over a year before he died, and signed at the Swedish-Norwegian Club in Paris on 27 November 1895. Nobel bequeathed 94% of his total assets, 31 million Swedish kronor, to establish and endow the five Nobel Prizes. Compared with some other prizes, the nomination and selection process is long and rigorous. This is a key reason why the Prizes have grown in importance over the years to become the most important awards in their fields. The Nobel laureates are selected by their respective committees.
For the Prizes in Physics, Chemistry and Economics, a committee consists of five members elected by The Royal Swedish Academy of Sciences; for the Prize in Literature, a committee of four to five members of the Swedish Academy; for the Prize in Physiology or Medicine, the committee consists of five members selected by The Nobel Assembly, which consists of 50 members elected by Karolinska Institutet; for the Peace Prize, the Norwegian Nobel Committee consists of five members elected by the Norwegian Storting (the Norwegian parliament). In its first stage, several thousand people are asked to nominate candidates.
200. A.P.J.Abdul Kalam Avul Pakir Jainulabdeen Abdul Kalam (born October 15, 1931, in Rameshwaram, Tamil Nadu, India), usually referred to as Dr. A. P. J. Abdul Kalam, was the eleventh President of India, serving from 2002 to 2007. During his term as President, he was popularly known as the People's President, and a poll conducted by the news channel CNN-IBN named him "India's Best President". Before his term as India's president, he worked as an aeronautical engineer with DRDO and ISRO. He is popularly known as the Missile Man of India for his work on the development of ballistic missile and space rocket technology. In India he is highly respected as a scientist and as an engineer.
Kalam played a pivotal organizational, technical and political role in India's Pokhran-II nuclear test in 1998, the first since the original nuclear test by India in 1974. He is a professor at Anna University (Chennai) and adjunct/visiting faculty at many other academic and research institutions across India. With the death of R. Venkataraman on January 27, 2009, Kalam became the only surviving former President of India. Kalam's father was a devout Muslim, who owned boats which he rented out to local fishermen and was a good friend of Hindu religious leaders and the school teachers at Rameshwaram. APJ Abdul Kalam mentions in his biography that, to support his studies, he started his career as a newspaper vendor. This was also told in the book A Boy and His Dream: Three Stories from the Childhood of Abdul Kalam by Vinita Krishna. The house Kalam was born in can still be found on the Mosque street in Rameshwaram, and his brother's curio shop abuts it. This has become a point-of-call for tourists who seek out the place. Kalam grew up in an intimate relationship with nature, and he says in Wings of Fire that he could never have imagined that water could be so powerful a destructive force as the one he witnessed when he was thirty-three. That was in 1964, when a cyclonic storm swept away the Pamban bridge and a trainload of passengers with it, along with Kalam's native village, Dhanushkodi.
He is a scholar of Thirukkural; in most of his speeches, he quotes at least one kural. Kalam has written several inspirational books, most notably his autobiography Wings of Fire, aimed at motivating Indian youth. Another of his books, Guiding Souls: Dialogues on the Purpose of Life, reveals his spiritual side. He has written poems in Tamil as well. It has been reported that there is considerable demand in South Korea for translated versions of books authored by him.
201. A.R.Rahman Allah Rakha Rahman (born January 6, 1966 as A. S. Dileep Kumar in Chennai, Tamil Nadu, India) is an Indian film composer, record producer, musician and singer. His film scoring career began in the early 1990s. He has won thirteen Filmfare Awards, four National Film Awards, a BAFTA Award, a Golden Globe and two Academy Awards.
Working in India's various film industries, international cinema and theatre, Rahman had, in a career spanning over a decade, sold more than 100 million records of his film scores and soundtracks worldwide by 2003, as well as over 200 million cassettes, making him one of the world's all-time top selling recording artists. Time magazine has referred to him as the "Mozart of Madras", and several Tamil commentators have coined him the nickname Isai Puyal (English: Music Storm). In 2009, the magazine placed Rahman on its Time 100 list of the 'World's Most Influential People'.
A. R. Rahman was born into a musically affluent Tamil family. His father, R. K. Shekhar, was a Chennai-based composer and conductor for Malayalam films. Rahman lost his father at a young age, and his family rented out musical equipment as a source of income. He was raised by his mother Kareema (Kashturi), who was from a Muslim family. During these formative years, Rahman served as a keyboard player and an arranger in bands such as "Roots", with childhood friend and percussionist Sivamani, John Anthony, Suresh Peters, JoJo and Raja. Rahman is the founder of the Chennai-based rock group "Nemesis Avenue". He played the keyboard and piano, the synthesizer, the harmonium and the guitar. His curiosity about the synthesizer in particular grew because, he says, it was the "ideal combination of music and technology". He began early training in music under Master Dhanraj. At the age of 11, he joined, as a keyboardist, the troupe of Ilaiyaraaja, one of many composers to whom musical instruments belonging to Rahman's father were rented. Rahman later played in the orchestras of M. S. Viswanathan and Ramesh Naidu, accompanied Zakir Hussain, Kunnakudi Vaidyanathan and L. Shankar on world tours, and obtained a scholarship to the Trinity College of Music, where he graduated with a degree in Western classical music.
202. Football
Football is the word given to a number of similar team sports, all of which involve (to varying degrees) kicking a ball in an attempt to score a goal. The most popular of these sports worldwide is association football, more commonly known as just "football" or "soccer". The English-language word "football" is also applied to "gridiron football" (a name associated with the North American sports, especially American football and Canadian football), Australian football, Gaelic football, rugby football (rugby league and rugby union), and related games. Each of these codes (specific sets of rules, or the games defined by them) is referred to as "football". In most codes, there are rules restricting the movement of players offside, and players scoring a goal must put the ball either under or over a crossbar between the goalposts. Other features common to several football codes include: points being scored mostly by players carrying the ball across the goal line; and players receiving a free kick after they take a mark/make a fair catch.
Peoples from around the world have played games which involved kicking and/or carrying a ball, since ancient times. However, most of the modern codes of football have their origins in England. While it is widely believed that the word "football" (or "foot ball") originated in reference to the action of the foot kicking a ball, there is a rival explanation, which has it that football originally referred to a variety of games in medieval Europe, which were played on foot. These games were usually played by peasants, as opposed to the horse-riding sports often played by aristocrats. While there is no conclusive evidence for this explanation, the word football has always implied a variety of games played on foot, not just those that involved kicking a ball. In some cases, the word football has even been applied to games which have specifically outlawed kicking the ball.
The Ancient Greeks and Romans are known to have played many ball games, some of which involved the use of the feet. The Roman game harpastum is believed to have been adapted from a team game known as "επισκυρος" (episkyros) or phaininda, which is mentioned by the Greek playwright Antiphanes (388–311 BC) and later referred to by the Christian theologian Clement of Alexandria (c.150-c.215 AD). The Roman politician Cicero (106-43 BC) describes the case of a man who was killed whilst having a shave when a ball was kicked into a barber's shop. These games appear to have resembled rugby football. Roman ball games already made use of an air-filled ball, the follis.
Documented evidence of an activity resembling football can be found in the Chinese military manual Zhan Guo Ce, compiled between the 3rd century and 1st century BC. It describes a practice known as cuju (literally "kick ball"), which originally involved kicking a leather ball through a small hole in a piece of silk cloth which was fixed on bamboo canes and hung about 9 m above ground. During the Han Dynasty (206 BC–220 AD), cuju games were standardized and rules were established.
203. Ronaldinho Ronaldo de Assis Moreira (born 21 March 1980 in Porto Alegre), commonly known as Ronaldinho or Ronaldinho Gaúcho, is a Brazilian footballer who plays for Italian Serie A side Milan and the Brazilian national team. Ronaldinho, Portuguese for "little Ronaldo", is known in Brazil by the nickname "Gaúcho", in order to distinguish him from Ronaldo, who was already called "Ronaldinho" in Brazil. Ronaldo simply went by his first name upon his move to Europe, thereby allowing Ronaldinho to drop the "Gaúcho" and remain simply Ronaldinho. Prior to his move to Milan, he played for Paris Saint-Germain and FC Barcelona, with whom he won his first Champions League in 2006. He became a Spanish citizen in January 2007.
Ronaldinho was born in the city of Porto Alegre, capital of the Rio Grande do Sul state of Brazil. His mother, Dona Miguelina, is a former salesperson who studied to become a nurse. His father, João, was a shipyard worker and footballer for local club Esporte Clube Cruzeiro (not to be confused with Cruzeiro EC). He suffered a fatal heart attack in the family swimming pool when Ronaldinho was eight. After Ronaldinho's older brother, Roberto, signed with Grêmio, the family moved to a home in the more affluent Guarujá section of Porto Alegre, a gift from Grêmio to convince Roberto to stay at the club. Roberto's career was ultimately cut short by injury.
Ronaldinho's football skills began to blossom at an early age, and he was first given the nickname Ronaldinho because he was often the youngest and the smallest player in youth club matches. He developed an interest in futsal and beach football, which later expanded to organized football. His first brush with the media came at the age of thirteen, when he scored all 23 goals in a 23-0 victory against a local team. Ronaldinho was identified as a rising star at the 1997 U-17 World Championship in Egypt, in which he scored two goals on penalty kicks.
Today, Roberto acts as Ronaldinho's manager, while his sister, Deisi, works as his press coordinator. Ronaldinho became a father for the first time on 25 February 2005, after Brazilian dancer Janaína Mendes gave birth to their son, who was named João after Ronaldinho's late father. Ronaldinho's career began with the Grêmio youth squad under head coach Liam Higgins. He made his senior side debut during the 1998 Copa Libertadores. In 2001, Arsenal expressed interest in signing Ronaldinho, but the move collapsed after he could not obtain a work permit because he was a non-EU player who had not played enough international matches. He considered playing on loan with Scottish Premier League side St. Mirren, which never happened due to his involvement in a fake passport scandal in Brazil. In 2001, Ronaldinho signed a five-year contract with French side Paris Saint-Germain.
204. Ronaldo Ronaldo Luís Nazário de Lima (born September 18, 1976), more commonly known as Ronaldo, is a Brazilian professional footballer currently playing for Campeonato Brasileiro club Corinthians. In 1993, Ronaldo began his professional football career playing for Cruzeiro, a club then on its way to success. In his first and only year with Cruzeiro, he amassed 12 goals in 14 appearances and led them to their first Copa do Brasil championship. After Cruzeiro, he signed with Dutch football team PSV in 1994. In 1996, his final year with PSV, Ronaldo helped them win the Dutch Cup. Ronaldo has also played for Barcelona, Milan, Real Madrid and Internazionale.
Ronaldo is also an established national footballer for Brazil. He has appeared in 97 international matches, amassing 62 goals and standing 15 goals away from the Brazilian national scoring record. He was a part of the Brazilian squad that won the 1994, and 2002 World Cups. During the 2006 World Cup, Ronaldo became the highest goalscorer in the history of the World Cup with his fifteenth goal, surpassing Gerd Müller's previous record of 14.
Nicknamed O Fenômeno ("The Phenomenon" in English), Ronaldo was also one of the most prolific scorers in the nineties and in the early twenty-first century. During his football career in Europe, Ronaldo became one of the most renowned strikers in the world by winning his first Ballon d'Or as the European Footballer of the Year in 1997 and again in 2002. He is also one of only two men to have won the FIFA Player of the Year award three times, along with French footballer Zinédine Zidane. In 2007, he was named as one of the best starting eleven of all-time by France Football and was named to the FIFA 100, a list of the greatest footballers compiled by fellow countryman Pelé.
After being scouted by Piet de Visser, Ronaldo was transferred for US$6 million to PSV, where he scored 42 goals in 46 league games and reached a total of 54 goals in 57 official appearances. With PSV, he won the Dutch Cup in 1996 and was Eredivisie top scorer in 1995. He then attracted the attention of Barcelona. He played for Barça in the 1996-97 season, scoring 47 goals in 49 games (including appearances in the Spanish Cup, Spanish Supercup and European Cup Winners' Cup), leading the Catalan side to the UEFA Cup Winners' Cup title (capping the season with the winning goal in the cup final itself) as well as the Spanish Cup and Spanish Supercup. He also won the La Liga top scorer award in 1997 with 34 goals in 37 games. Until the 2008-09 season, Ronaldo remained the last player to score more than 30 goals in La Liga; Diego Forlán then netted 32 goals, but Ronaldo's total of 34 is still unmatched.
205. Online Business Directory
HealthcareMagic is a leading consumer-centric health company founded to transform how people approach their overall health and wellness. ...
eBusiness Indya Pvt. Ltd. specializes in providing E-Mail Marketing, Mail/Postal Marketing, SMS Marketing, Internet Marketing, Database Services, Web Design, Multimedia Presentations, E-Business Solutions & Total Software Solutions tailored to meet sp...
Nri-Gujarati.com is a premium web portal to satisfy the needs of Gujaratis across the world, and a portal to bridge gaps for Gujaratis world-wide. Here we are offering a common platform to connect all Gujaratis across the ...
Mumbai Mirror Online, part of The Times of India, offers daily news and happenings in and around Mumbai. The Times of India is the nation's leading media conglomerate with 45 dailies and periodicals in 3 languages and 108 editions from 9 centres and a c...

LocalDailyDiscounts.com is your online coupon service provider that allows you to advertise specials up to the minute each and every day of the year. No more waiting for a coupon to go to print or being stuck with a program that is not working for yo...
206. Business Programs
Internet entrepreneurs have discovered many online profit-making business ideas and affiliate programs. One needs to watch out for the many "fake" online business ideas and affiliate programs offering free website setup, where almost everything is seemingly free.
One also needs to be careful when looking for an online business, because there are many scams out there. I have examined and compiled some home business ideas and affiliate programs. One of the trustworthy home-business screening websites is Rip-off Report. The site claims to have helped consumers settle their disputes with offending companies. All you have to do is go to the Rip-off Report.com website, type the company or website's name into its search engine, and you'll see any reports filed against it. It costs nothing to look up information on Rip-off Report.
It is stated on Rip-off Report's website that the FBI, FTC, SEC, IRS, Homeland Security, US Postal Inspectors, the Justice Department, and government agencies from over 9 other countries use its service. A spokesman for the New York State Consumer Protection Board says he uses the site whenever new complaints come into his agency; a search at Rip-off Report immediately tells him if a new scam trend is operating, he said. He thinks consumers should use the site the same way.
Anyone can post a complaint about a company on their personal website, but the power of Rip-off Report is its quick placement in search engine results. The site is not perfect, though; one needs to take the same precautions as when examining anything else on the Internet.
207. Work From Home
Coaching is a low-cost business that can be rewarding and yield potentially high returns. As a coach, your role is to guide people to achieve success in their careers, relationships and life. You help identify the problems, challenges and weaknesses they face in their life or in their business, and help them overcome these challenges.
There are many opportunities for earning money from home – from buying into a turnkey business opportunity system, finding work at home employment, starting a home-based business, or even buying a home-based franchise. The key is to know what interests you, what you’d like to do, and what your resources can allow you to do.
There are a number of franchises that you can operate from your home, from business consulting to home-based pet services to home services. Home-based franchises typically have a lower cost of entry than traditional franchises, given the lower franchise fees.
The skills that you employ (or used to employ) in your day job can be used to help you earn money on the side. Whether you do web design, writing, desktop publishing, or computer repair, you can use these skills and work on the side, possibly on a part-time basis.
208. Internet Marketing Lessons
"eBook Camp: Proven Internet Marketing Techniques to Grow Your Business" by Corey Perlman is an excellent primer on internet marketing and search engine optimization for beginners. The book covers all the essentials of Internet marketing, from the effective use of metatags to the right way of using social networks to the steps needed to generate links to a site. It focuses on low-cost and easy-to-implement strategies that can increase search engine ranking and boost traffic.
The strong point of eBook Camp, and what sets it apart from similar basics-of-Internet-marketing books, is how it makes its must-learn lessons concrete.
In addition, the book’s Walkthrough and Exercise sections provide a practical guide for readers on how to implement the concepts in each chapter. It’s one thing to know about the importance of link popularity, and quite another to actually know how to start boosting it.
The book is targeted at low-budget newbie online marketers. Hence, there are recommendations that focus on keeping everything on a shoestring budget, e.g. not spending more than $15 to host a 6-8 page informational site. A more advanced online marketer may find the book’s teachings too simple, even outdated. However, even a seasoned Internet marketer can benefit from reading the Tips for Success, if only to be reminded of the basics of Internet marketing and search engine optimization.
209. Building Your Business Link
Write an article to be used by another web site, with your link in the resource box at the bottom. Also write testimonials for web sites whose products or services you have used, with a link to your website. Consider the popularity of a web site when deciding which sites to ask for a link.
Each of the pages in your web site has the opportunity to appear in the search engines. Therefore, each page must be given its own unique title tag that includes the important keywords of that page.
Be sure to do a little selling in your description tag. You only have two sentences (the title tag and the description tag) to convince search engine users to click on your site instead of the other results.
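The advice above can be sketched in code: give every page its own keyword-rich title tag plus a description tag that does a little selling. This is a hypothetical illustration, not from the book; the page name, keywords and sales pitch are invented examples.

```python
# Sketch: build a unique <title> and meta description for each page,
# the two tags a search-engine user actually sees in the results.

def head_tags(page_name, keywords, pitch):
    """Return a keyword-rich title tag and a 'selling' description tag."""
    title = "<title>{} | {}</title>".format(page_name, ", ".join(keywords))
    description = '<meta name="description" content="{}">'.format(pitch)
    return title, description

# Invented example page for a small online store:
title, desc = head_tags(
    "Handmade Candles",
    ["soy candles", "scented gifts"],
    "Hand-poured soy candles, shipped free. Order today and save 10%.",
)
print(title)
print(desc)
```

Each page of the site would call this with its own name and keywords, so no two pages share a title tag.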
Use all the features that sites such as MySpace, Facebook or LinkedIn offer: post videos of customer testimonials or pictures of your business operations. Ask your web designer to make sure your site can be viewed clearly in all browsers.
210. Formulation Of Your Business Plan
The primary value of your business plan will be to create a written outline that evaluates all aspects of the economic viability of your business venture, including a description and analysis of your business prospects.
Since the My Own Business, Inc. course is broken down into 14 of the most important topics to consider in starting or operating a business, your business plan can easily be organized into this same format. Included in this session, and in each of the following sessions, is a one-page business plan template.
The most important ingredient for your success will be yourself. Focus on how your prior experiences will be applicable to your new business. Prepare a résumé of yourself and one for each person who will be involved with you in starting the business. Be factual and avoid hype. This part of your business plan will be read very carefully by those with whom you will be having relationships, including lenders, investors and vendors. Templates for preparing résumés are available at your library, at Kinko's, in bookstores and on the Internet under "résumés."
Provide a complete assessment of the economic environment of which your business will become a part. Explain how your business will be appropriate for the regulatory agencies and demographics with which you will be dealing. If appropriate, provide demographic studies and traffic flow data, normally available from local planning departments.
211. Phone Card Web Site
Ever dream of having your own online business that can generate more income than your current day job? Ever wanted a home-based business that you can manage from virtually anywhere, even when you're on vacation? We have just what you need! Join our E-trepreneur program today and get on board the $4 billion prepaid phone card industry, growing by $400 million per year (Intelecard news). Our E-trepreneur program provides all the tools and knowledge you need to succeed in the work-at-home prepaid phone card business. We promise that we have taken away all the hassle and burden of starting an online phone card business. This isn't just another work-at-home program with no benefits. We have affiliates making over $300,000.00 per year. There's no risk!
212. Learning From Big Boys
Tony Hsieh, CEO of Zappos.com, is ranked by Twitterholic.com as #36, with 628,707 followers. In one interview, Hsieh revealed that his company doesn't look at Twitter as a way to drive additional traffic. In fact, his posts are mostly his personal thoughts, updates on his day and events at the office, rather than straight-out promotion of Zappos.com deals. Twitter for Zappos is more about branding, serving as a way for customers (and even employees) to see that they are real people. Their goal is to develop a personal connection with the Zappos brand. Zappos.com even offers Twitter training for its employees, and to date, 430 employees are on Twitter, including Hsieh. The company also has a special subdomain on its site, twitter.zappos.com, that covers the posts of all its employees, public mentions of Zappos.com, a collection of pictures posted on Twitter (twitpics), and even a tutorial/quick-start guide on using Twitter.
Pete Cashmore, founder of the social media guide website Mashable.com, is ranked #27 on Twitter with 670,666 followers. All of his tweets are Mashable posts, but instead of just repeating the title of the article, he often adds his unique spin to drive more interest -- and clicks -- to his posts. Cashmore's tweeting is also a good example of how to brand your business on Twitter. He uses the business name "mashable" as his Twitter name, while giving his business a personality and a face by putting his name and photo on his bio page.
213. Small Business Scorer
The purpose of a business plan is to give a new small business focus and a plan of action. SCORE Counselors to America's Small Business and The Company Corporation have released a new version of How to Really Start Your Own Business. This popular free workbook is available at SCORE's 370 offices nationwide. More than 500,000 entrepreneurs have used this workbook as the first step in setting up their businesses.
The business planning workbook describes how to move from a business idea to a business plan. This interactive workbook provides simple exercises to think through planning, choosing a business structure, financing, and start-up operations. Entrepreneurs can get the latest information on how to launch a Web site, attract visitors, and avoid e-commerce mistakes.
SCORE CEO Ken Yancey says, "How to Really Start Your Own Business provides valuable information that can help start-up entrepreneurs achieve their business goals and thrive, even during a down economy, as we move through the recession to better economic times." Yancey adds, "The Company Corporation has been a long-standing SCORE supporter and sponsor of educational resources for small businesses." You can get your workbook and meet for free mentoring sessions at a SCORE office near you.
"We are proud to support SCORE in its efforts to educate America's entrepreneurs about how to start and run a successful business," says Brett Davis, General Manager of The Company Corporation. "This how-to workbook is another example of how The Company Corporation is making it easier for small business owners to follow their dreams."
214. Women Business Success
SCORE Counselors to America's Small Business won the prestigious 2009 Interactive Media Award (IMA) for Outstanding Achievement for the SCORE Women's Success Blog. SCORE received high marks for design, content, functionality and usability. The blog excelled in all areas of the judging criteria and represents a high standard of planning, execution and overall professionalism.
This popular blog is ranked in the top one percent of blogs worldwide, according to Technorati. The SCORE Women’s Success Blog features: Julie Brander, chapter chair with New Haven SCORE; Peg Corwin, marketing chair with Chicago SCORE; Peggy Duncan, volunteer and productivity expert with Atlanta SCORE; Betty Otte, district director and franchise expert with Orange County SCORE in Calif.; and Christine Banning, vice president of marketing and communications with the SCORE Association in Washington, D.C.
Blog posts cover a range of topics from business planning and marketing to access to cash. Recent posts on social media, high tech tools and spring contests for entrepreneurs bring new and current info to entrepreneurs. Bloggers share advice and wisdom on business success and how mentoring can help entrepreneurs build on their strengths.
Last year, SCORE launched the SCORE Women's Success Blog, written by women for women, as a resource offering insights, advice and fresh takes on issues facing women entrepreneurs across America. This is the first blog from SCORE at the national level; SCORE recently launched a second blog.
215. Data entry jobs
As per the above chart, you will receive a minimum number of jobs per day; if you get more, you can earn more, and we also provide bonus points based on your performance. Taking data entry jobs in your spare time is a great way to make some extra money. Our members participate in hundreds of FREE MONEY making data entry job opportunities without any investment. Join immediately and earn money from the next minute.
We need more data entry operators to complete our huge projects, so if you refer more people to our team, we will provide you a share of your referral earnings. Apart from your regular data entry earnings, this is an additional earning opportunity for you. Referral tools and methods are clearly explained after you become a member of this website. By joining today, you will receive a free joining bonus. Many companies ask for a deposit before providing data entry jobs, but we provide a joining bonus to our members because we have a huge quantity of projects and need more data entry operators to complete them.
When will you get the payment, and what is the mode of payment? You will be paid by check on or around the 8th of every month for the previous month's earnings.
Software jobs
Softwares House is one of the leading global providers of Information Technology services and business solutions, and a master and thought leader in the field of software development.
The job involves testing, installing and configuring our software, and providing training and support to its users; the incumbent will be trained on the software.
An opportunity for PHP developers with an experience level of between years. Job location is Pune. Skill set required: PHP, MySQL, Ajax, Joomla, Drupal, Smarty.
Well versed and experienced in PHP and MySQL. Capable of developing database-driven web applications. Minimum one year of experience on quality projects. We prefer candidates who can join immediately (we are based in Bangalore).
216. Nursing Informatics Job
With leading specialists in every field of medicine, the advances pioneered at NewYork-Presbyterian Hospital have improved the lives of people everywhere. Uniting the power of two renowned medical centers – Columbia University Medical Center and Weill Cornell Medical Center – we deliver the highest level of inpatient, ambulatory and preventative care.
Under the direction of Children’s Quality and in collaboration with the Chiefs of Pediatric Critical Care, the system administrator will coordinate the VPICU system used for our Pediatric Intensive Care Unit. This system is used to produce statistical reports and related studies relevant to Quality Assurance issues in the PICU. This role includes some supervisory responsibilities.
We seek a Registered Nurse with a strong informatics background and excellent instructional skills for this outstanding opportunity; pediatric critical care experience highly preferred. In this role, you will oversee the accuracy and timeliness of all PICU data, address and resolve problems with manual and automated systems in the PICU, and prepare reports and statistics concerning processes, outcomes, utilization and efficiency. You will coordinate all system upgrades and changes with NACHRI/VPS, a nonprofit initiative formed to improve critical care quality and outcomes for all children and their families through collaborative high quality data management, as well as actively participate in NACHRI User Groups. You will perform orientation and ongoing training of staff in the usage and upgrades of the VPICU information system, and serve as liaison with IS vendors, MIS department and other departments regarding any system issues.
The ideal candidate must have a Bachelor's Degree from a recognized college or university, preferably in Information Systems; equivalent MIS experience accepted. Registered Nurse (preferably with pediatric critical care experience) highly preferred. Must have exceptional organizational and training skills, along with extensive PC/database skills.
217. Automotive engineer
Automotive engineering is concerned with the design, development and production of vehicles and their component parts. Automotive engineers may specialise in a wide variety of areas including powertrain (body, chassis and engine systems), electronics and control systems, fuel technology and emissions, fluid mechanics, aerodynamics and thermodynamics.
Alongside an excellent knowledge of the engineering principles relating to their field, automotive engineers must understand and be able to use a range of new technologies in order to keep pace within a fast-moving and forward-thinking industry
Automotive engineers usually work in multidisciplinary teams to develop land-based vehicles. Their roles combine engineering expertise with management/leadership skills and their work directly influences a company's competitive edge and hence its profitability.
Exact responsibilities depend on the particular area of specialism chosen (powertrain, fuel systems, etc.) and which of the three main stages of development an engineer works to support: design, research and development, or production. However, typical responsibilities include:
218. Energy manager
An energy manager is responsible for improving the energy efficiency of large organisations and domestic properties. They are often required to act as agents of change within their organisation, coordinating all aspects of energy management, from energy efficiency and reduction of carbon dioxide emissions to waste management and sustainable development.
Changes in building regulations and an increase in legislation and European directives on emissions and efficiency have increased the need for organisations to develop carbon management and sustainability strategies.
Duties vary according to the setting in which the work is being carried out and may range from researching new developments and managing a range of strategies, to providing expertise to individuals.
Typical duties include developing, coordinating and implementing the aims and objectives of strategies and policies to reduce energy consumption (e.g. EU directives on energy performance and emissions), and monitoring and reviewing the effectiveness of those policies and strategies, including coordinating annual progress reports.
219. Business Development
Chevron business development professionals work on small cross-functional teams to identify and develop new, large-scale upstream business opportunities worldwide, with a focus on Russia, the Middle East and North Africa. Chevron business development professionals focus on global business opportunities that help Chevron grow and expand our reach to provide the energy the world needs.
Chevron business development teams are located around the globe. Depending on your role, you may work at one or more of the following locations.
Engineering
A Chevron engineering career offers the opportunity to put your expertise and passion for solving problems to work in a stimulating environment where you'll have access to leading-edge technology and tools.
You'll work with other talented engineers around the world to help create new and innovative energy solutions to power the world.
As a Chevron engineer, you'll likely work for several different Chevron operating companies during your career. You may also have the opportunity to travel to Chevron locations around the world.
220. Drilling Engineering
Chevron drilling engineers supervise drilling, completion and workover operations at Chevron's rigs to ensure that operations are safe, environmentally conscious and cost-efficient.
You’ll apply your expert knowledge and skills and receive technical training in the many aspects of safe and environmentally friendly drilling and completion operations.
Your first drilling engineering assignments will be field-based (working at a rig site) on a rotating schedule for about two years. You'll then supervise more complex operations at remote sites. You may also have the opportunity to support well design, planning and rig operations.
Typical responsibilities include: well design in support of a field development; estimating costs and risk; reporting and optimization; and applying technology and innovation in directional drilling, mud systems, casing and drill string design, and completions.
221. Geotechnical engineer
A geotechnical engineer applies the sciences of foundation engineering, soil and rock mechanics, engineering geology, and other disciplines to civil engineering design and construction, whilst working to preserve and protect the physical environment.
Geotechnical engineers use the skills of analysis to provide highly accurate calculations. Working in a team alongside geological engineers and hydrogeologists, they focus on providing information for and solutions to a specific client's project.
They may assess materials used to ensure the stability of structures such as dams, tunnels or airport runways. They can also be involved in the design and analysis of structures such as bridges, towers and buildings.
Geotechnical specialists may hold an Institution of Civil Engineers (ICE) accredited civil engineering degree, MSc or BSc in geotechnical engineering, engineering geology, ground engineering or a related, non-accredited degree in a subject such as geology.
222. Electronics Jobs
With the phenomenal growth of the software industry in India, the hardware business is also getting a boost and creating electronics jobs in India in large numbers. Though hardware companies are far behind the huge software export volumes, they are experiencing an all-time high in the domestic market. According to NASSCOM, the Indian hardware industry has reached US billion in the FY.
A major share of the domestic hardware market comes from hardware systems, peripherals, networking and equipment sales. With this growth of the hardware market, there is a surge in electronics jobs in India. Hardware manufacturing companies and maintenance service providers are offering lucrative electronics jobs in India.
We at Alp Management Consultants provide effective human resource services to the IT sector. Our service will help you drastically bring down the turnaround time, effort and cost involved in the HR and recruitment process. We have different recruitment and staffing solutions that will help you manage your core business more effectively. Following are some of the HR services that we offer.
So, if you are looking to hire the best people from the Indian job market for your manufacturing unit or maintenance jobs, register with us. As a professional HR firm, we provide a common platform for employers and employees. If you are a competent candidate looking for the best hardware jobs in India, we can help you get one. Enroll with us and we will send you the hottest offers for electronics jobs in India, with regular updates.
224. Audiologist
Audiologists are healthcare scientists who assess, diagnose and rehabilitate patients with hearing, balance and tinnitus problems.
Their job activities are similar to those of audiological scientists, although they do not generally get involved with the research, development and management aspects of the role.
For some activities, audiologists act in an assistant role to audiological scientists, e.g. when administering hearing tests to babies or tests for balance disorders (these tests typically require two people to perform). They may work in an audiology department alongside audiological scientists or in departments staffed by audiologists only.
Most audiologists are employed by the NHS (see NHS Careers), although there are opportunities in private hospitals. Locum work is also often available, usually via agencies. There are opportunities in the private sector, working for companies such as Boots and Specsavers dispensing hearing aids (some degree courses include registration with the Hearing Aid Council).
225. Aromatherapist
Aromatherapy is the systematic use of the therapeutic components of aromatic plants, known as essential oils, in holistic treatments for healing a wide range of physical, psychological and emotional disorders. Aromatherapy also strives to improve general vitality and prevent disease.
The Aromatherapy Council, the lead body for aromatherapy in the UK, and Skills for Health, the Sector Skills Council (SSC) for the UK health sector, along with other key organisations including the General Regulatory Council for Complementary Therapies (GRCCT) and the Complementary and Natural Healthcare Council (CNHC), have developed national occupational standards for aromatherapy. The Aromatherapy Council publishes details of core curriculum qualifications and the professional associations that offer courses meeting its standards.
226. Animator
An animator produces multiple images called frames. When sequenced together rapidly, these frames create an illusion of movement known as animation.
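The frame arithmetic behind this can be made concrete. Below is a minimal Python sketch, assuming the traditional 24 frames-per-second film rate (a common convention, not a figure stated in the text):

```python
# Minimal sketch of the arithmetic behind animation: at a given frame rate,
# a movement lasting a given duration needs duration * fps frames.
# The 24 fps default is the traditional film rate; values are illustrative.

def frames_needed(duration_seconds, fps=24):
    """Number of frames an animator must produce for one movement."""
    return round(duration_seconds * fps)

print(frames_needed(2))        # a 2-second movement at 24 fps -> 48 frames
print(frames_needed(1.5, 30))  # 1.5 seconds at 30 fps -> 45 frames
```

This is why even a short animated sequence represents a large volume of drawn or rendered images.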
Animators tend to work in 2D animation, 3D model-making animation, or computer-generated animation. Computer-generated animation features strongly in motion pictures (to create special effects or an animated film in its own right), as well as in aspects of television work, the Internet and the computer games industry.
The basic skill of animation still relies heavily on the animator's artistic ability, but there is a growing need for animators to be familiar with technical computer packages.
Much of the work involves pitching and being proactive in selling your ideas and work to prospective customers and clients. This applies across the board, whether you are self-employed, working freelance or employed within a business.
227. Airline pilot
Airline pilots fly passengers and cargo on a national and international basis for business, commercial and leisure purposes. The aircraft is usually operated by two, three or four pilots, depending on the type of aircraft and the length of the journey.
The captain is the pilot-in-command and has overall responsibility for the safe and efficient operation of the aircraft, including its crew.
Prior to the flight, pilots study flight plans and ensure that the aircraft's controls are operating efficiently. They are also responsible for checking the weather conditions and liaising with air traffic control.
Pilots are employed in a number of different areas: passenger scheduled services, passenger charter services, freight services and business aviation (general aviation).
228. Air traffic controller
Air traffic controllers maintain the safe and orderly movement of aircraft along major air routes and around airports by giving pilots instructions and advice as to height, speed and course.
The majority of controllers work at control centres as area controllers responsible for keeping aircraft flying the airways. Others work as approach controllers dealing with aircraft movement into and out of an airport, or as aerodrome controllers guiding aircraft through landing and to the terminal.
Over 2.4 million freight and passenger flights take off or land in the UK, or pass through UK airspace, every year. Air traffic controllers usually have a number of aircraft under their control at any one time.
Approach controllers are based at a control centre or an airport tower. They guide and sequence aircraft into the most efficient order for landing. This includes dealing with instrument landing systems, which allow some planes to make automatic landings, and making sure that planes are placed in holding patterns when airports are busy.
229. Air cabin crew
Air cabin crew are responsible for the safety of passengers and are specially trained to deal with security and emergency situations
Besides ensuring the safety of the aircraft, air cabin crew are also employed to ensure passengers are comfortable and that the flying experience is a pleasant one.
Air cabin crew attend to passengers' needs throughout the flight and provide a high level of customer service, serving refreshments and selling duty-free goods. They are expected to be friendly, enthusiastic and courteous at all times.
The work of air cabin crew may be stressful and demanding, but it is also a varied, interesting and rewarding role.
230. Advertising account planner
Account planners play a key part in developing advertising campaigns for a huge range of products and services. The planner is responsible for writing the formal creative brief and for providing the ideal environment for creative development.
Acting as the voice of the consumer within an agency, a planner uses research data to identify ideal audiences and optimum methods of communication.
Planners combine market data, qualitative research and product knowledge within a core proposition to enable the creative team to produce advertising ideas that resolve defined business problems. With increasing public awareness of marketing strategies, a key challenge is to develop innovative ways to reach consumers.
Gaining a comprehensive context for advertising strategies by analysing a wide range of information in great detail, including demographics, socio-economics and the market for the client's product and market share.
231. Accounting technician
Accounting technicians work in all areas of finance. They usually start working in a support role within a firm of accountants, or in the accounts or finance departments of organisations in industry, commerce or the public sector.
However, there are opportunities for progression with experience. Those with more experience may become self-employed, providing a variety of accountancy and taxation services to a range of small to medium sized businesses.
In many larger organisations, accounting technicians work alongside members of chartered accountancy bodies. In smaller organisations, they may be the only financially trained member of staff.
Different financial sectors require specialist knowledge. This may determine which areas a technician chooses to specialise in. More senior positions include finance manager, budget controller and internal auditor, with each role requiring particular knowledge and experience.
232. Acupuncturist
An acupuncturist is a complementary health practitioner who takes a holistic approach to the maintenance of health and the management of disease, with a focus on improving overall wellbeing.
Acupuncture is an ancient Chinese holistic therapy based on the theory that the body depends on life energy, known as Qi, being in balance. Acupuncturists correct any imbalances by inserting fine needles into acupuncture points, thus maintaining or restoring good health and wellbeing.
Acupuncturists conduct one-to-one consultations with their patients, using their skills and knowledge to treat a wide range of health problems.
The first consultation may last up to an hour and a half to allow the acupuncturist to take a detailed case history before making a diagnosis and beginning treatment. Subsequent sessions may take 45 minutes to an hour. The duration of the treatment programme varies depending on the severity of the problem. Some patients may require only a few sessions, while others may need 20 or more.
233. Amenity horticulturist
Amenity horticulture covers the design, construction, management and maintenance of living, recreational and leisure areas. These include: country parks; botanic and public gardens; sports facilities; urban tree planting; historic gardens and landscapes; cemeteries and crematoria; and other public spaces.
An amenity horticulturist may be involved in all stages of design, growing and maintenance. The work is increasingly complex, requiring management and technological competence alongside scientific understanding and the traditional skills of cultivation.
It also requires acceptance and understanding of the important contribution that horticulture can make to conserving the environment and improving quality of life. Amenity horticulturists may also work in education or the media.
An interesting future development, certainly in gardens open to the public, could be the use of RFID tags attached to plants and trees. Electronic readers will read the tag and relay the story of the plant or tree to the person holding the reader.
234. Financial trader
There are three types of trader: proprietary, flow (market makers), and sales. Flow traders buy and sell products on the financial markets for the bank's clients. Products include securities and other assets such as futures, options and commodities. They make prices and execute trades, seeking to maximise assets or minimise financial risk. Proprietary traders trade on behalf of the bank itself. Their aim is to buy low and sell high.
They do this by analysing economic data, technical analysis, experience, cross-asset correlations and identifying undervalued and overvalued prices. Sales traders deal directly with clients, providing market information, execution and promoting new financial ideas to clients. They are intermediaries between the client and the market maker.
Whilst there are many similarities in the work of flow and proprietary traders and those working in sales, their roles differ substantially. The main difference is risk - sales traders don't take risk while flow/prop traders take risks seeking reward.
Flow and proprietary traders focus on executing trades at the right price. Traders sit at workstations in a dealing room, tracking market movements. Markets can move rapidly and trading can be hectic. The role combines speaking with colleagues, making phone calls and making instant decisions. Traders in this area must be alert and ready to make decisions based on the smallest movements in the market.
They react to a change in parameters/constituents that is not already implied by the current market price. The price should reflect the intrinsic value of the asset, which can change at any second for multiple reasons. Their decisions are informed by in-depth market reports provided by their firm's investment analysts and by sales traders, as well as streamed market news from agencies such as Bloomberg and Reuters.
Traders also use their own technical analysis. Much of the job is based on independent thinking. Independent thought, especially in proprietary trading, adds value to any team. During the first year the trainee performs relatively menial tasks such as data analysis and administrative duties before being trusted to be responsible for the firm's money.
235. Banker
A banker is responsible for establishing and maintaining positive customer relationships, planning and delivering effective sales strategies and monitoring the progress of new and existing financial products.
Bankers may work as managers in high street branches, providing operational support on a day-to-day basis, or in more specialised posts in corporate or commercial departments at area, regional or head offices.
Banks operate in a fiercely competitive marketplace where change is common. Products and services must develop to satisfy the expectations and demands of customers. Working with staff and customers to achieve targets has become a major part of the role.
Responsibilities and work activities may vary between retail and corporate and commercial banking. Most retail bankers work in high street branches, dealing with both private and corporate customers, while some work in regional or head offices. Bankers who work with commercial or corporate customers may be based in branches or may work from specialised area or regional offices.
Bankers with area and regional responsibilities adopt a strategic role and, while retaining overall accountability for service and product delivery, often delegate supervision of day-to-day operations to staff in branch outlets.
236. Barrister
Barristers (in England and Wales) are specialists in advocacy, representing individuals or organisations in court, under instruction from a solicitor or another designated professional. They also give advice to their professional clients, who are usually solicitors. Barristers have rights of audience in all courts.
Barristers usually specialise in particular areas of law such as criminal law, chancery law, commercial law, and common law, which includes family, housing and personal injury law. Most barristers work on a self-employed basis, from chambers. An increasing number of employed barristers work in private and public organisations.
For example, the work of a criminal barrister is likely to involve a lot of advocacy in court, while a family law barrister may represent clients in court in a contact dispute or divorce case, but may also be involved in mediation as a way of avoiding the need to go to court.
Employed barristers undertake similar activities for one company or client. At more senior levels, they may also become involved with the development of legal policy and strategy.
237. Biochemical engineer
Biochemical engineers apply engineering science principles to biological materials, processes and systems to create new products. These may include almost anything - vaccines, foods, plastic forks and plates, cattle feed, clothing, soda pop sweeteners - the list is endless.
The processes biochemical engineers work on may dramatically improve our lives. They are involved, for example, in: making 'magic bullets' that locate and kill cancerous tumours; developing and producing pharmaceuticals to reduce heart disease; and synthesising high-performance lubricants which last a car's lifetime. They also develop processes to reduce pollution or treat waste products
Biochemical engineers may take on managerial responsibility for projects, or specialise in particular processes or techniques. An increasing number of graduates are choosing to join small start-up companies working on new technologies.
238. Bilingual secretary
A bilingual secretary combines language and administrative skills to interpret, translate and summarise information in order to ensure effective and efficient communication on a global level. The extent to which language skills are used on a daily basis varies according to the employer.
The most common languages utilised are Spanish, French and German, although there are growing opportunities in other international languages, such as Chinese, Arabic and Japanese. Opportunities for bilingual secretaries exist in international organisations, foreign and British banks, professional consultancies (e.g. management, legal and insurance), manufacturing companies and other industrial organisations that operate in the UK and overseas.
239. Government research officer
Government research officers work within a wide range of government departments and bodies. They liaise closely with civil servants and other government analysts, such as operational researchers, economists and statisticians.
Their role is to provide research input for the analysis required for the development, implementation, review and evaluation of new and existing policies. Ultimately, this research evidence helps inform the policy decisions of ministers.
Government social research officers account for over 1,000 members of the workforce, spread over 20 government departments. They are responsible for the research and analysis of policy, as well as commissioning and managing research.
Their challenging, fast-moving and diverse role has a direct impact on many government activities, often at a high-profile level. Job activities vary significantly according to department and policy area and whether research is conducted in-house or commissioned from external researchers.
240. Brewing engineer
Brewing engineers provide the engineering expertise to modify, design, install and maintain brewing, beer processing and high-speed packaging plants. The work also involves energy management and managing services, including water, steam and electricity.
There tend to be more employment opportunities in larger companies or specialist brewing engineering consultancies. There are opportunities for brewing engineers with experience to move into a more specialist role, such as safety and risk management, or into commercial areas. Many brewing engineers also move into other similar fields, such as process engineering.
Typical work activities include managing a team of specialist technicians engaged in the provision of site services and plant maintenance to achieve and/or improve production targets.
249. Broadcast engineer
A broadcast engineer operates, maintains, updates and repairs hardware and systems used across TV, radio, podcasts and other channels, ensuring that programmes are broadcast on time to the highest possible level of quality.
Broadcast engineers work in a team with others including producers, studio managers and presenters as well as other technical staff.
As well as being spread across ever-expanding methods of distribution, the work of broadcast engineers takes place in a range of locations and work situations. In addition to studio work, there may be outside broadcasts, when sound and images are relayed live back to a studio or straight to the network.
Typical tasks include minimising loss of service at times of equipment failure by rapidly identifying and implementing alternative methods of service provision.
250. Broadcast journalist
Broadcast journalists are responsible for investigating, gathering and reporting on news and current affairs. They are expected to present this information in a fair, balanced and accurate way through news bulletins, documentaries and other factual programmes for radio, television and online broadcast.
Skillset (Sector Skills Council for the Audio Visual Industries) defines broadcast journalism as 'the collection, verification and analysis of events which affect people'. The work of a broadcast journalist shapes people's perceptions of the world in which they live and therefore has a far-reaching impact.
Broadcast journalists can fill a number of roles within the media including editor, reporter, presenter/news anchor, producer and correspondent.
Although exact duties and responsibilities will vary from role to role and between radio, television and the internet, broadcast journalists will generally be involved in many of the following duties on a daily basis:
251. Broadcast presenter
Broadcast presenters are the public face - or voice - of broadcast shows in television and radio. They work across the entire area of broadcasting: regional, national, satellite and cable television or radio.
The exact nature of the job varies according to whether the programme is essentially about, for example, news, current affairs, sport, or music, or whether it is a chat show, game show, etc.
However, the general principles remain the same: introducing and hosting the programme and creating links between items; often introducing and interviewing guests, and interacting with the audience.
Radio presenters on music shows usually 'drive' the desk and operate some of the technical equipment for recording and playback. This generally involves using computers, often with touch screens, to cue up and play music and jingles.
252. Biomedical engineer
Biomedical engineers apply engineering principles and materials technology to healthcare. This can include researching, designing and developing medical products, such as joint replacements or robotic surgical instruments; designing or modifying equipment for clients with special needs in a rehabilitation setting; or managing the use of clinical equipment in hospitals and the community.
Biomedical engineers can be employed by health services, medical equipment manufacturers and research departments/institutes.
Job titles can vary depending on the exact nature of the work. As well as biomedical engineer you are likely to come across bioengineer; design engineer; and clinical scientist (in a hospital setting/clinical situation).
Typical work activities include using computer software and mathematical models to design, develop and test new materials, devices and equipment. This can involve programming electronics; building and evaluating prototypes; troubleshooting problems; and rethinking the design until it works correctly.
253. Building surveyor
Building surveyors provide professional advice on all aspects of property and construction. They work on site with new buildings and are concerned with the aftercare and performance of existing buildings. This is a very wide field and may include advising on various aspects of buildings at different stages, including:
The nature of the work may range from the design of large, multimillion-pound structures to modest adaptations and repairs, and sometimes includes working with buildings of architectural or historic importance.
Building surveyors may be called upon to give evidence in court in cases where building regulations have been breached and as expert witnesses on building defects and dilapidations.
254. Building project manager
Building project managers have overall responsibility for the planning, management, coordination and financial control of a construction project.
It is their responsibility to see that the clients' wishes are adhered to and that the project is completed on time and within the agreed budget. The project manager may be involved from the initial conception and design of the project onwards.
Building project managers are likely to work on more than one project at a time.
Traditionally, this was a role that graduates might move on to after a few years' experience in another position, such as site engineer or construction manager. However, the industry is changing, in particular in its approach to the construction process (formerly more adversarial, but now increasingly based around the concept of partnership). This means that the future of this role is as a graduate entry position.

Farm managers handle administration, work machinery, organise associated businesses and manage staff. They need to have technical and practical competence, coupled with the ability to make sound business decisions.
Farms are generally arable, dairy or livestock, run by farm management companies or single-owner farmers. Crops range from wheat, barley and rye to sugar beet and linseed. Livestock are usually pigs, cows or sheep.
Farm managers must appreciate the need to satisfy regulations set by the Department for Environment, Food and Rural Affairs (DEFRA) for safe, high-quality produce farmed in an environmentally sustainable manner.
They are responsible for planning, organising and managing the activities of a farm to meet the objectives of the owner. Typical work activities include:
255. Facilities manager
Facilities managers are responsible for the management of services and processes that support the core business of an organisation. They ensure that an organisation has the most suitable working environment for its employees and their activities. Duties vary with the nature of the organisation, but facilities managers generally focus on using best business practice to improve efficiency, by reducing operating costs whilst increasing productivity.
This is a wide field with a diverse range of specialisms and responsibilities in different combinations, depending on organisational structure. Facilities managers are involved in both strategic planning and day-to-day operations, particularly in relation to buildings and premises. Likely areas of responsibility include:
Facilities managers are employed in all sectors and industries and the diversity of the work may be reflected in different job titles such as operations, estates, technical services, asset or property manager. Responsibilities are often broad, covering several departments, as well as central services that link to all the teams in the organisation.
In smaller companies, duties may include more practical and hands-on tasks. Many facilities management professionals are employed on a consultancy basis, contracted to manage some or all of these activities by a client organisation.
256. Industrial buyer
Purchasing and procurement is an important aspect of successful business performance in a wide range of sectors.
Industrial buyers are responsible for ensuring that their companies select the most appropriate goods and services on the basis of price, quality, delivery times and services support. They must also take their own business brand needs and customer interests into consideration.
Industrial buyers develop the best buying strategies for their organisation with a view to maintaining consistent quality and managing costs. In an increasingly competitive marketplace, with growing public awareness of corporate social responsibility, buyers have a key role in industries as diverse as telecommunications, financial services and manufacturing.
The work activities carried out by an industrial buyer depend largely on the business functions, size and location of the employing organisation. Duties will also be influenced by seniority.
257. Barrister's clerk
A barrister's clerk is responsible for running the business activities and administration of a barrister's chambers. The role is integral to the success of a set of chambers as a business and as a practice.
Barristers' clerks must be familiar with court procedures and etiquette. They will also develop an expertise in the type of law undertaken by their chambers.
This demanding but rewarding role requires a combination of commercial acumen, legal knowledge and strong interpersonal skills. The term 'clerk' is historical and does not accurately reflect the co-ordination of workload, marketing and financial management undertaken.
Typical activities include planning a case in detail to take into account factors such as conferences, preparation time and estimated number of days in court, then arranging meetings with the instructing solicitor, client and barrister to discuss the case.
258. Careers adviser/personal adviser
A careers adviser provides information, advice and guidance to help people make realistic choices about education, training and work. Careers advisers work with a range of clients aged 14 to adult.
In England the usual entry point for careers advisers is as a personal adviser (PA) with a Connexions or careers company local authority partnership. PAs work specifically with young people, or with older clients who have learning difficulties or disabilities. They offer guidance on a variety of issues including education, careers, relationships, health, housing and money.
The work of a personal adviser (PA) will vary according to how their employing Connexions or careers company local authority partnership is structured. Most will have a mixed caseload of education and community work. Some will have a caseload with a specific emphasis, e.g. working with young people who are not in education, employment or training. Intensive PAs work in an in-depth way with a smaller number of clients.
Typical activities include keeping up to date with labour market information, legislation, and professional and academic developments by visiting employers, training providers and training events run by educational and professional bodies; and providing information, advice and guidance about a range of issues, such as careers, education, employment and training, housing, money, health, drugs and bullying, either directly or in partnership with specialist agencies.
259. Careers consultant
A careers consultant provides support on all aspects of career management and development, using guidance, counselling, coaching and advisory techniques to assist clients to clarify and achieve career and work goals.
Many careers consultants work within an organisation, guiding and advising employees, often individually but also in groups. A large number of careers consultants work on a freelance basis with individual fee-paying clients in a private setting.
The role overlaps with that of Human resources officer and Occupational psychologist, as well as Careers adviser/personal adviser. Careers consultants should not be confused with Recruitment consultants, who look for suitable candidates to fill their clients' vacancies.
Typical activities include conducting one-to-one consultations with clients, most commonly face-to-face but increasingly via telephone, Skype or email - usually an in-depth initial consultation lasting about an hour and involving some form of work history analysis, with a number of similar follow-up consultations thereafter.
260. Careers information officer
Careers information officers identify, analyse and assess the suitability and value of information relevant to a careers service or careers information unit.
They develop strategies for information planning, procurement, provision and management to meet current and anticipated needs, while considering budgeting constraints
Often providing a service both internally to service users, such as current and former students and work colleagues, and externally to other organisations and clients, they may also be responsible for other areas within the service, not necessarily information-related
Individual tasks tend to vary between post holders, depending on the size and the type of the employer. However, there is common ground and typical activities will usually include organising, classifying, maintaining and storing information, often using computer applications for access and retrieval.
261. Cartographer
A cartographer is involved with the scientific, technological and artistic aspects of developing and producing maps and cartographic information. Cartographers present complex information as diagrams, charts and spreadsheets, as well as in the form of conventional maps. Geographical information systems (GIS) and digital-mapping techniques now dominate the role.
Maps and detailed geographical information are needed for a range of purposes, from everyday use by individuals to large-scale industrial development.
Cartographers work within a variety of areas, including publishing, government, surveying and conservation. The role varies widely from the development and design of geographical information to more strategic and technical work.
A cartographer's role can vary widely; from the technical role of the development, maintenance and manipulation of cartographic databases, to the promotion of effective and efficient visualisation of geospatial information, to the design of bespoke maps.
262. Catering manager
Catering managers plan, organise and develop the food and beverage services of organisations and businesses, whilst meeting customer expectations, food and hygiene standards and financial targets.
There is a wide range of jobs in catering management, along with a number of different routes into the industry. Roles include: managing restaurants, bars and other outlets in hotels, resorts or cruise liners; providing catering services at events; and running catering operations in organisations such as hospitals, schools and higher education institutions.
The role varies according to the size and nature of the establishment. In a small operation, the catering manager has more of a 'hands-on' role and will be involved in the day-to-day running of the operation, while in a larger organisation, the catering manager might have other managers and supervisors to handle different functions. In contract catering, the catering manager will spend time negotiating with the client organisation, assessing its requirements and ensuring that it is satisfied with the service delivered.
263. Furniture designer
Furniture designers produce designs for items of furniture and related products. Designs may be for mass production, in small batches, or as one-offs. Designers may be involved in the design aspect of the work alone, or they may be designer/makers, producing items from their own designs. Designers work alone or alongside colleagues creating concepts and designs that balance innovative design, functional requirements and aesthetic appeal.
The process of furniture design demands creativity, business awareness and skills in marketing, finance, sales and manufacturing. The role may involve a number of functions, particularly for the self-employed, including designer, production manager, buyer, salesperson, accountant, and maintenance engineer.
Work activities vary according to whether you are a self-employed furniture designer working alone or with one or two other like-minded craftspeople, or whether you are employed by a manufacturing company with a group of experienced furniture designers.
A furniture manufacturing company may employ salaried designers to create innovative furniture designs, whereas self-employed furniture designers need to engage in self-promotion by advertising their services or attending furnishing fairs and exhibitions.
264. Chartered accountant
Chartered accountants are responsible for financial reporting, taxation, auditing, forensic accountancy, corporate finance and insolvency.
They play a strategic role by providing professional advice, aiming to maximise profitability on behalf of their client or employer. They work in many different settings, including public practice firms, industry, commerce and the public sector.
In public practice firms, chartered accountants provide professional services to fee paying clients, from private individuals to large commercial and public sector organisations.
In commerce, industry and the public sector, they may work in fund management and procurement, as well as in financial management and reporting roles.
265. Chemical development engineer
A chemical development engineer creates and develops industrial processes and plant to make the products on which modern society depends. These products include fuels and energy, food and drink, clean water, artificial fibres, pharmaceuticals, chemicals, plastics and toiletries.
They may focus on one or more of the following: researching new products from trial through to commercialisation; managing scale-up processes from pilot plant to full industrial-scale manufacturing; improving product lines; developing and modifying the manufacturing and processing plant that produces the products; and designing and commissioning new plants.
Protecting the environment and safety are significant concerns for chemical development engineers. Typical activities include working closely with process chemists and control engineers to ensure the process plant is set up to provide maximum output levels and efficient running of the production facility, and undertaking small and intermediate-scale manufacturing and packaging activities in pharmaceutical product development for clinical trial purposes.
266. Chiropractor
Chiropractors are primary healthcare professionals concerned with the diagnosis, treatment and prevention of mechanical disorders of the musculoskeletal system and the effect of these disorders on the functioning of the nervous system and general health.
Treatment involves using the hands to apply a specific force to adjust the joints of the body, concentrating particularly on the spine. Treatment may also involve working on muscles. Chiropractors treat chronic and acute conditions.
The profession takes a holistic approach to the needs of patients, considering physical, psychological and social factors, and recognises the value of working with other healthcare practitioners.
The majority of a chiropractor's time is spent in individual consultations involving diagnosis and treatment. Typical activities include taking detailed medical histories, including information on previous injuries, surgery, general health and lifestyle.
267. Choreographer
A choreographer works with dancers to interpret and develop ideas and transform them into the finished performance. This might mean taking overall control of a production, or working under an artistic director in a dance company, or the director of an opera, play or musical. Opportunities also exist to work in films or with pop groups.
Choreographers work in a variety of ways. Some will work freelance, while others are attached to a company. Some choreographers combine the job with writing or teaching, while others will set up their own company.
Almost all choreographers begin their careers as dancers and usually start choreographing while still performing, especially in smaller companies. Choreographers frequently absorb artistic influences from other art forms, such as theatre, the visual arts and architecture.
A number of dance schools, such as the Laban Centre for Movement and Dance , run courses in which choreography is a core area. The Council for Dance Education & Training (CDET) also has details of courses in a wide range of institutions. The Place has a choreographic centre for both emerging and established choreographers.
268. Cinema manager
A cinema manager is responsible for the efficient day-to-day control of all cinema activities. In accordance with relevant statutory and company requirements, the role will incorporate a variety of areas including HR responsibilities, training staff and marketing initiatives to ensure optimum profit. Other tasks include liaising with local press and websites to promote film premieres and cinema times.
Managers of smaller cinemas may also host private and/or group screenings for clients. Independent cinemas may promote art and educational films; an independent cinema manager will be looking to offer a wide variety of films, including art house films, to reach sales targets.
Cinema managers, like many other managers in the leisure industry, are responsible for promoting effective visitor services, whilst ensuring business objectives are achieved. Typical work activities include:
Cinema managers may also be responsible for hiring out the venue for business, training or leisure purposes. Managers may also arrange special events, including showing films of a particular genre or theme. Additionally, those based in large cinema chains may be involved in hosting premieres, which will entail:
269. Clerical assistant
Whilst the role of clerical assistant may not be viewed as a typical graduate position, it can be a good way of gaining experience in an office environment without the need for specific secretarial qualifications. A willingness to get involved could help future career development and will provide an opportunity to learn more about working in administration and gain valuable work experience.
In-house training is usually provided. Basic IT skills, such as use of Microsoft packages, email and the internet, are normally required. Formal secretarial qualifications are usually unnecessary at this stage.
The opportunity to mould a less-experienced individual in a clerical role is often looked on favourably by some employers, and it is possible in some industries to pursue roles in HR, or other non-professional positions such as PAs by working 'up the ladder' from a clerical position.
270. Civil service administrator
A civil service administrator will work within one of the civil service departments throughout the UK which, collectively, employ just under half a million people.
Civil service departments work closely with the government to formulate policies. These policies are then delivered by government agencies. Since the function of the departments of the civil service varies greatly, so too will the role of administrator.
You will, however, have the opportunity to contribute proactively to political affairs and, whichever department you are based in, you will work closely with a team to ensure a high quality of services are delivered.
The tasks involved will vary greatly according to the department in which you are based, but generally are likely to include some or all of the following activities: using excellent customer service skills to deal with service users, be they other civil servants, members of the public or other organisations.
271. Clinical biochemist
Clinical biochemists carry out complex analytical work. They analyse and interpret data relating to patients' samples to assist with the investigation, diagnosis and treatment of diseases.
Clinical biochemists work with other health professionals, such as biomedical scientists, to detect changes in the complex biochemistry of body fluids, for example, increases in glucose levels in diabetes mellitus. They develop and implement new techniques; interpret results; liaise with and advise clinical staff; are responsible for the evaluation and quality assessment of diagnostic tests; and play a role in developing and managing hospital and community analytical services.
A typical laboratory processes several thousand samples per day. Of these, a few hundred results will be abnormal and need to be scrutinised by a clinical biochemist. Other work activities include:
carrying out complex biochemical analyses on specimens of body fluids and tissues, using spectrophotometry, mass spectroscopy, high performance chromatography, electrophoresis, immunoassay and, increasingly, molecular biological techniques;
272. Clinical psychologist
Clinical psychologists aim to reduce psychological distress and enhance and promote psychological wellbeing.
They often work in health and social care settings, as part of a multidisciplinary team. They use the methods and findings of psychology and psychological theories with clients to enable them to make positive changes in their lives.
Clinical psychologists work with people of all ages who experience mental or physical health problems. These may include:
Most clinical psychologists work with a particular client group, or in a particular setting, e.g. adult mental health, forensic services, child and family, learning disabilities or older adults.
273. Colourist
An understanding and sense of colour are central to the skills and abilities of a fashion/textile designer. Where such ability proves to be an exceptional skill, specialisation as a colourist is an option.
Large manufacturers of fibres and producers of wool, yarns and textiles may employ colourists in the selection of colour ranges of dyes for their products.
Printed textile firms employ colourists to paint colour ranges for their fabric designers. Other examples of areas where colourists may find employment are with design consultants or manufacturers concerned with surface pattern designs, automobile designs and paints.
274. Community arts worker
Community arts workers collaborate with a wide variety of local groups, encouraging the use of artistic activities to support their development and improve their quality of life. Generally, they work in areas where there are social, cultural or environmental issues to be addressed.
Project work may fall into such categories as race, gender, disability, health and the environment, and may focus on the following groups:
Depending on the role, the work varies considerably between the facilitation of creative projects and more administrative responsibilities.
In some cases, the work is mainly artistic and creative in nature, particularly if the role is as an 'artist in residence' for a specific project or initiative, but community arts workers often have a more administrative, strategic and managerial role, particularly those working within local government, or for arts companies, agencies or charities as project officers or coordinators.
275. Corporate photographer:
A corporate photographer, or commercial/industrial photographer, works for businesses and organisations, undertaking whatever photographic work that organisation requires. This can include pictures for internal newsletters or published magazines, portraits of senior staff members for annual reports and other corporate publicity, and pictures for the organisation's website. A corporate photographer may also have to provide suitable photographs for external media, e.g. the relevant trade press.
Many corporations find it cheaper to employ specialist freelance commercial photographers than to employ their own staff photographers.
276. Community home/school teacher:
Community home/school teachers link parents with schools through a number of activities. The focus of their work depends on the local authority's policy in this field.
Local authorities may have special initiatives - related to the government’s inclusion policy - that depend on the social needs of a particular geographical area. There may be a home link team set up in an authority to work on a multitude of issues.
The work of community home/school teachers often focuses on early intervention work (from nursery age) and promoting parental involvement in early literacy and numeracy work
277. Completion engineer:
After an oil well has been drilled and the casing run and cemented, equipment is installed to allow it to be operated safely, economically and in a controlled manner. The anticipated life of an oil well may be ten to twenty years, and consideration has to be given to its characteristics during this period and the type of equipment required. This process is known as well completion.
The completion engineer must have a good understanding of the various types of well and the operations that may have to be performed.
Some wells require methods of artificial lift, be it through gas or pumping, to lift the fluid from the reservoir to the surface. Some treatment or stimulation may also be required later in the life of the well.
278. Consulting civil engineer:
Civil engineers are involved with the design, development and construction of a huge range of projects in the built and natural environment. Their role is central to ensuring the safe, timely and well-resourced completion of projects in many areas, including highway construction, waste management, coastal development and geotechnical engineering.
Consulting civil engineers liaise with clients to plan, manage, design and supervise the construction of projects. They work in a number of different settings and, with experience, can run projects as a project manager. Civil engineering offers many opportunities as well as the satisfaction of helping to improve and enhance public quality of life in many settings.
Within civil engineering, consulting engineers are the designers whereas contracting engineers turn their plans into reality. Consulting civil engineers provide a wide range of services to clients. During the early stages of a career, work will involve taking responsibility for minor projects, but the size of the projects may increase as experience is gained.
279. Community health doctor:
Community health doctors are employed by primary care organisations to provide medical services needed to fill any gaps in the service provided by general practitioners (GPs). In many cases, they also offer an alternative for any patient who may not wish to see their GP.
The work is based in a community centre with other health care professionals. The range of services offered can include podiatry, counselling, dentistry, dietetics, physiotherapy and speech therapy, amongst others. They may also run specialist clinics at community centres, such as family planning, child health, diabetes and asthma. If a particular surgery is over-run or short-staffed, community health doctors may also run clinics for that practice.
Individual work with patients within community health can be similar to working in a practice. However, you will not have the same responsibilities as a GP, who has more organisational and administrative responsibilities concerning the running of the practice.
280. General practice doctor
General practitioners (GPs) provide primary and continuing medical care for patients. They take account of physical, psychological and social factors when diagnosing illness and recommending the required treatment. Patients may be referred to hospital clinics for further assessment and/or treatment.
GPs may run specialist clinics within the practice for patients with specific conditions. They work alongside other health care professionals to discuss care options for patients and their families.
GPs who are partners are responsible for the running of the practice. This involves a range of administrative activities, such as employing staff, keeping abreast of paperwork, and managing budgets.
Partners in a practice may decide to expand their career portfolio and specialise in a certain area of medicine, such as obstetrics and gynaecology, psychiatry, orthopaedics, etc. They may also specialise in areas such as IT, human resource management, medical education, or training.
281. Geological mapper
A geological mapper is responsible for surveying and mapping particular areas of land to determine near-surface deposits, rock type and geological structure. This involves collecting, analysing and recording rock, soil and sediment samples.
The work of a geological mapper is predominantly field-based, though it also combines desk-based research, data analysis, report writing and management tasks.
Also known as field survey geologists, the job can involve working concurrently on a variety of different projects for different clients. Work is highly systematic in nature and high-level analytical skills are paramount. Geological mappers require excellent scientific understanding and detailed knowledge of earth sciences.
Typical activities include communicating complex technical issues and their implications to non-specialists, for example, explaining to local planners how the geology of an area may affect possible routes for a new road. The work also involves liaising and coordinating activities with a range of other technical specialists, such as civil engineers.
282. Drilling engineer:
A drilling engineer develops, plans, costs, schedules and supervises the operations necessary to the process of drilling oil and gas wells, from initial well design to testing, completion and abandonment. Engineers are employed on land, on offshore platforms or on mobile drilling units, either by the operating oil company, a specialist drilling contractor or a service company.
The role can involve administering drilling and service contracts, engineering design and the planning of wells, and supervising the drilling crew on site. Drilling engineers work with other professionals, such as geologists and geoscientists, to monitor drilling progress, oversee safety management and ensure the protection of the environment.
283. Electronics engineer
Electronics is the technology associated with electronic circuits and systems, and uses components such as capacitors, diodes, resistors and transistors.
Electronics engineers research, design, develop, test and produce precision components and systems, developing the way electricity is used to control equipment. The work is usually carried out in cross-functional project teams, with colleagues in electronics and other branches of engineering.
Electronics touches on almost all areas of human activity, so its applications are diverse. They include acoustics, defence, medical instruments, mobile phones, nanotechnology, radio and satellite communication and robotics. Subfields of electronics engineering include control engineering, instrumentation, signal processing and telecommunications engineering.
Electronics engineers work on a project through all its stages: from the initial brief for a concept; through the design and development stage; to the testing of one or more prototypes; and through to final manufacture and implementation of a new product or system.
There are two main types of graduate electronic engineer. Chartered Engineers (CEng) have the greatest level of responsibility for engineering projects. They develop solutions to problems using new or existing technologies. Incorporated Engineers (IEng) take responsibility for specific aspects of a project. They maintain and manage applications of current and developing technology.
284. Glass blower/designer
A glass blower/designer is responsible for designing, producing, decorating and finishing pieces of glass ranging from giftware, tableware, exhibition pieces, stained glass windows, mirrors, ornaments and other architectural glass products through to glass equipment used in scientific laboratories.
Most of the work is carried out by small, independent studios, though there are some larger glass manufacturers based in the UK.
The work can be commissioned by individuals, corporate organisations or the public sector. Most blowers/designers will be involved in the whole commission process, from concept to completion.
Glass blowers/designers may also be involved in restoring, renovating and repairing original pieces. Typical activities include: designing, producing and finishing decorative pieces, including windows, mirrors, lamp bases, ornaments, tableware and sculptures; soldering pieces of coloured, painted or enamelled glass; using decorative techniques, including engraving, acid-etching, stencilling, and sand or grit blasting; restoring, renovating and repairing original pieces; working with molten glass (from a furnace) and a blowing iron to produce an infinite range of shapes/forms; and kiln forming – slumping glass into a mould.
285. Community development worker
A community development worker works with particular communities in order to collectively bring about social change and improve quality of life. They work with individuals, families or whole communities, empowering them to identify their own needs and take action to address them.
Community development workers often act as a link between communities and local government and other statutory bodies. They are frequently involved in addressing inequalities, and projects often target communities perceived to be disadvantaged, for example due to race, economic circumstances or geography.
Community development work seeks to engage communities actively in making sense of the issues which affect their lives, setting goals for improvement and taking action through empowerment and participative processes. A good deal of the work is project-based, which means that community development workers usually have a specific geographical community or social group they focus on.
Community work can be generic or specialised. Generic community work takes place in a given geographical area, focusing on working with the community to identify their needs and issues, formulating strategies and developing services to address those issues. The setting is either urban or rural, with rural community development work increasingly attracting attention in recent years. Specialised community work focuses on either specific groups within a region (such as the homeless, the long-term unemployed, families with young children or ethnic minorities) or on particular concerns (such as public transport, mental health or drugs action).
286. Fashion designer
Fashion designers work on the design of clothing. Some may focus completely on a specialist area, such as sportswear, childrenswear or accessories.
The main markets they design for are haute couture, designer ready-to-wear and high street fashion. Developments in technology mean that a designer ready-to-wear product can be produced as a high street version in just a few weeks.
Depending on level of responsibility and the company, a designer may work to their own brief or be given a brief to work towards (including specifications in relation to colour and fabric) and develop a product from this.
Experienced designers with larger companies may focus more on the design aspect, with pattern cutters and machinists preparing sample garments. In smaller companies these, and other tasks, may be part of the designer's role.
287. Jewellery designer
Jewellery designers design and make body adornments using a variety of materials, including gold, silver and precious stones. Practising one of the oldest crafts, designers create pieces that can have great sentimental significance or symbolic meaning, can be wearable or are decorative artefacts in their own right. They must be able to relate well to their clients in order to understand design specifications, as well as master the creative and practical skills needed to make a product.
The majority of jewellery designers are self-employed so also require commercial awareness. Some jewellery designers focus more on design using specialist companies to provide the different stages of the making process.
When working for a company, a distinction is made between design and production, with these activities divided into separate job roles.
Promoting and developing the business is crucial for success as a self-employed jewellery designer. Many designers try to boost their reputation by networking, entering competitions and attending fairs. Other activities include consulting with galleries, store buyers and suppliers and researching jewellery and fashion trends.
288. Embroidery designer
Embroidery design is a needlework-based niche area of the crafts industry. Embroidery designers develop, plan and produce a range of works.
They may be employed by commercial organisations who produce embroidery kits for use at home, or they may work for themselves, running a workshop, or selling their products directly at craft markets or through contemporary art galleries.
Public awareness of contemporary crafts is increasing, and a small number of embroidery designers are employed by clothing labels, retailers and design companies.
289. Database administrator
A database administrator (DBA) is responsible for the performance, integrity and security of a database. Additional role requirements are likely to include planning, development and troubleshooting.
Database administrator (DBA) roles are increasingly identified by the databases and processes they administer and the capabilities of the database management system (DBMS) in use.
The work of a database administrator (DBA) varies according to the nature of the employing organisation and the level of responsibility associated with the post. The work may be pure maintenance or it may also involve specialising in database development. Typical responsibilities include ensuring that:
data remains consistent across the database;
data is clearly defined;
users access data concurrently, in a form that suits their needs;
there is provision for data security and recovery control (all data is retrievable in an emergency).
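As a minimal sketch of the consistency and recovery-control duties listed above, here is how a database management system enforces an integrity rule a DBA might define. This uses Python's built-in sqlite3 module; the `accounts` table and its constraint are hypothetical examples, not from the original text.

```python
import sqlite3

# In-memory database with a hypothetical table and an integrity rule
# (balances may never go negative).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE accounts (
        id      INTEGER PRIMARY KEY,
        owner   TEXT NOT NULL,
        balance REAL NOT NULL CHECK (balance >= 0)
    )
""")
conn.execute("INSERT INTO accounts (owner, balance) VALUES ('alice', 100.0)")
conn.commit()

# An update that would violate the constraint is rejected by the DBMS,
# so the data remains consistent even if application code misbehaves.
try:
    conn.execute("UPDATE accounts SET balance = -50 WHERE owner = 'alice'")
except sqlite3.IntegrityError:
    conn.rollback()  # recovery control: undo the failed change

balance = conn.execute(
    "SELECT balance FROM accounts WHERE owner = 'alice'"
).fetchone()[0]
print(balance)
```

The original value survives the rejected update, which is the kind of guarantee a DBA configures and monitors in practice.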
290. Food technologist
Food technologists develop the manufacturing processes and recipes of food and drink products. They work on existing and newly discovered ingredients and technologies to invent new recipes and concepts and modify foods to create, for example, fat-free products and ready meals.
Food technologists are involved in conducting experiments and producing sample products, as well as designing the processes and machinery for making products with a consistent flavour, colour and texture in large quantities.
This must be done within a strict and ever-changing regulatory framework around the treatment of foodstuffs. The work may also involve building relationships with suppliers and customers, as well as ensuring products are profitable.
A considerable proportion of time on the manufacturing side is spent liaising and cooperating with technical and commercial colleagues in procurement, sales, technical services, marketing and distribution, as well as with official food inspection and hygiene agencies.
291. Community health doctor
Community health doctors often get into the role through their vocational GP training. Whilst there are no formal community health specialist training programmes, you would be expected to have the relevant qualifications and training, for example, to run a family planning clinic you would need a certificate in family planning.
292. Multimedia specialist
Multimedia specialists are designers who combine design and technical knowledge to create information and communication technology (ICT) based products, such as CD-ROMs and DVDs.
When the design is complete, multimedia specialists use authoring software to arrange the files in a single program (to enable interactivity and navigation through the product content). They also test and adjust the product to deal with technical problems, and produce documentation describing the creation, content and processes of each file.
In designing products, multimedia specialists use industry-standard computer design packages such as Photoshop, Macromedia Director, Flash, Flash 3D Animator and Dreamweaver. Using these, and other computer packages, they are able to incorporate the work of other specialists, including writers, artists, animators, video producers, programmers and sound engineers, in the final product.
Depending on the complexity of the product, the authoring of files into a single program may be done by an assistant using hypertext markup language (HTML), or by a software programmer using object-oriented programming languages such as Java or C++.
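The 'authoring' step described above, arranging separate media files into one navigable product, can be sketched in a few lines. This is a hypothetical Python example; the asset titles and file names are placeholders, not taken from the original text.

```python
# Hypothetical list of separately produced assets (title, file name).
assets = [
    ("Introduction", "intro.html"),
    ("Video tour",   "tour.html"),
    ("Credits",      "credits.html"),
]

# Build an HTML navigation menu linking each asset, mirroring how
# authoring software arranges individual files into one interactive
# product the user can move through.
links = "\n".join(
    f'  <li><a href="{href}">{title}</a></li>' for title, href in assets
)
page = f"<ul>\n{links}\n</ul>"
print(page)
```

In a real product this navigation shell would be generated by the authoring package itself; the sketch only shows the principle of binding independent files into a single navigable structure.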
293. Graphic designer
Graphic designers produce design solutions to communicate their clients' messages with high visual impact. Graphic design solutions are required for a huge variety of products and activities, such as websites, advertising, books, magazines, posters, computer games, product packaging, exhibitions and displays, corporate communications and corporate identity, e.g. giving organisations a visual 'brand'.
Working to a brief agreed with the client or account manager, a graphic designer develops creative ideas and concepts, choosing the appropriate media and style to meet the client's objectives.
The work demands creative flair, up-to-date knowledge of industry software and a professional approach to time, costs and deadlines.
Whether they are self-employed, working freelance or employed within a business, designers often have to be proactive in presenting or 'pitching' their ideas and designs to prospective customers.
294. Early years teacher
Early years, or nursery, teachers work in pre-school, nursery and reception classes with children aged between three and five. They are responsible for developing and implementing work schemes and lesson plans in line with the requirements of the foundation stage. This involves organising and developing the nursery learning environment and resources in order to facilitate learning.
Early years teachers foster the understanding, social and communication skills of the children. They develop and maintain relationships with parents/guardians to further support pupils.
Early years teachers record observations and summarise the children's achievements. They focus on optimum child development and preparation for a successful transition to primary school education.
Early years teachers teach all areas of the foundation stage, which is focused on helping the children to achieve early learning goals.
295. Ecologist
Ecologists are concerned with ecosystems as a whole and, within them, the abundance and distribution of organisms and the relationships between organisms and their environment. Ecologists carry out a wide range of tasks depending on their specialist knowledge (e.g. freshwater, marine, terrestrial, fauna, flora).
When starting out, ecologists often conduct surveys to identify, record and monitor species and their habitats. With career progression, work is likely to become more wide-ranging, with senior ecologists being more involved in policy and management work.
Work commonly supports compliance with European and UK environmental legislation, so ecologists must be aware of environmental policies and legislation.
The work of an ecologist depends on the nature of the employer and the purpose of the work. For example, environmental impact assessments are required by law for planning permission; the UK Biodiversity Action Plan at national and local level has given rise to comprehensive lists of species that need to be monitored and protected;
there is an increasing demand for the collection and management of biological information for national databases (see, for example, the National Biodiversity Network (NBN)); and climate change is monitored by mapping the movement of key species.
296. Economist
Economists provide specialist advice based on the application of economic theory and knowledge. They do this by studying data and statistics and using their understanding of economic relationships to spot trends, carrying out considerable amounts of research and collecting large amounts of information.
They then analyse this data to assess feasibility, produce forecasts of future trends and make recommendations of ways to improve efficiency or take advantage of future activities.
Economists use specialist software and advanced methods in statistical analysis to assemble, sift and present this information, which is then used to advise businesses and other organisations, including government agencies.
Areas of research can cover any aspect of economic and social policy, ranging from interest rates, taxation and employment levels, to energy, health, transport and international development.
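As an illustration of the trend-spotting and forecasting described above, here is a least-squares fit over quarterly figures. All the numbers are made-up illustrative data, not taken from the source; real economists would use specialist statistical software on far larger datasets.

```python
# Hypothetical quarterly output figures (quarter index, output value).
quarters = [1, 2, 3, 4, 5, 6]
output   = [10.0, 10.8, 11.9, 12.7, 13.8, 14.9]

# Ordinary least squares: fit a straight trend line y = a + b*x.
n = len(quarters)
mean_x = sum(quarters) / n
mean_y = sum(output) / n
slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(quarters, output))
slope_den = sum((x - mean_x) ** 2 for x in quarters)
slope = slope_num / slope_den          # trend per quarter
intercept = mean_y - slope * mean_x

# Forecast the next quarter by extrapolating the fitted trend.
forecast = intercept + slope * 7
print(round(slope, 3), round(forecast, 2))
```

The fitted slope quantifies the trend the economist "spots" in the data, and extrapolating the line is the simplest form of the forecasting mentioned in the text.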
297. Agricultural economist
An agricultural economist offers advice on economic trends with reference to agriculture and other rural economic contexts. Much of the work involves advising farmers on ways of improving their businesses.
Typical activities include: monitoring economic changes in agriculture; devising methods and procedures for obtaining required data; understanding various sampling techniques that may be used to conduct different types of surveys; understanding and interpreting data; applying models of economic behaviour to agricultural changes and developments; advising agricultural organisations on the costs and benefits of options; and advising government, employers or trade unions on the economic implications of agricultural policy options.
298. Quantity surveyor
A quantity surveyor manages all costs relating to building projects, from the initial calculations to the final figures. Surveyors seek to minimise the costs of a project and enhance value for money, while still achieving the required standards and quality.
Many of these are specified by statutory building regulations, which the surveyor needs to understand and adhere to.
A quantity surveyor may work for either the client or the contractor, working in an office or on site. The job may also be titled construction cost consultant or commercial manager. Typical activities include: offering advice on property taxation; providing post-occupancy advice, facilities management services and life cycle costing advice; assisting clients in locating and accessing additional and alternative sources of funds; enabling clients to initiate construction projects; and advising on the maintenance costs of specific buildings.
299. Commissioning editor
Commissioning editors identify and assess the publishing market, develop and support projects and authors, maintain a publishing programme and contribute to marketing and sales activities. In essence, their role is to develop a publisher's book list.
The role is most associated with book publishing. In magazine publishing, commissioning editors commission writers to produce articles and features. In academic journal publishing, contributions are obtained through an external editor.
This is a middle to senior level post requiring suitable experience and ability. An initial entrant is likely to be recruited to a more junior position, which may then lead on to a career as a commissioning editor.
Commissioning editors are involved with a book at every stage of its production. They are the key link between the manuscript and the published work. To develop their publisher's book list, commissioning editors research their field to learn about trends and gaps in the market.
300. Education inspector
The Office for Standards in Education, Children's Services and Skills (OFSTED) is the independent government body responsible for inspecting schools, local education authorities (LEAs) and some further education provision, as well as government initiatives, such as the national numeracy and literacy strategies.
Following the introduction of the new school inspection arrangements in September, responsibility for the recruitment and training of inspectors now falls to regional inspection service providers (RISPs), who are working with Ofsted to deliver school inspections.
All Ofsted inspectors are now trained Her Majesty's Inspectors (HMIs) and are employed directly by Ofsted. For each inspection, a team of inspectors is assembled, which must be able to cover all areas of the curriculum and cross-curricular issues, such as special educational needs and equal opportunities. A team may consist of two or more inspectors.
Before being appointed, an inspector has to attend a course of training provided or approved by Ofsted. Most inspectors are or have been headteachers, deputy heads or heads of department. Each team of inspectors must be led by a registered inspector, who has passed an assessment of ability to lead the inspection and who has a legal duty to ensure the inspection is carried out properly.
301. Home based computer jobs
This is a great opportunity for stay-at-home moms, dads working two jobs, people in dead-end jobs, labourers, helpers, long commuters, students and many others. In fact, it's perfect for anyone who would like to get paid in the comfort of their own home! You can work whenever you want, no matter where you are. The part you will like best is that you can do it for 30 minutes or a few hours; you can fit it in whenever you have time, and you're not tied down for long periods. If you would rather stay home, pick your own hours and spend time with the people you love, then this is the part-time job for you! You will love the flexibility this job will give to your life!
There are around 1,050 paying websites on the Internet from which you can earn a good income. First, you have to register yourself with these companies. Registration is totally free with all 1,050 companies that we provide you. Once you fill in a form for these companies, they will start sending e-mails to your email address, which involve viewing their advertisements, clicking on banners, visiting websites and so on. This is simple work and can be done by any person with basic internet knowledge. For doing these assignments, the multinational advertising companies will pay you. The number of paying websites is increasing every day as the viewer base grows around the globe. You are helping the advertising websites and they are returning the favour. They are able to pay you because their advertisers pay them every time they send you a promotion or advertisement. Each advertising website has a contract with its advertisers that their product should be seen by a large number of viewers all over the world. The email you receive contains an advertisement for a product, service or idea. The multinational companies want their advertisements to be viewed by a large number of people all over the world, as this builds popularity, brand loyalty, a better image and so on.
302. Web designing job
1. Candidates should have very good working knowledge of Adobe Photoshop and all related graphic design software. 2. The person should have a high degree of creativity and must be able to convert the client's concepts into stunning Flash and other animations for webpages. 3. Candidates must be able to work in HTML/CSS with full validation as per the latest standards. 4. Candidates must be able to generate original, modern, web 2.0-styled webpage templates, banners, headers, buttons, icons, logos, etc. for websites, and must work on other graphic designs for print media. 5. The person must be able to meet tight deadlines and must be fully dependable and committed in their work.
Desirable skills include: lots of creativity and a taste for design and usability; advanced Adobe Photoshop or Macromedia Fireworks skills; a great portfolio with previous experience designing websites, logos and banners; experience with HTML, CSS, Dreamweaver and general web design and development processes; patience for iterations and the ability to ask questions to identify what is required; ethics and the ability to deliver what you promise on time; extensive hands-on experience with Photoshop, HTML, JavaScript, CSS, FrontPage and Dreamweaver; expert knowledge of HTML 4.0, XHTML, DHTML, CSS and JavaScript (a strong knowledge of web standards is a must); knowledge of Flash, XML, web servers, AJAX and cPanel an advantage; and experience with technologies like PHP, ASP.NET or ASP preferred. Excellent English speaking and writing skills are required. Candidates should be team players, self-motivated, vigilant, enthusiastic and self-learners.
Other roles require: 1) good knowledge of web design; 2) in-depth knowledge of HTML, XML, JavaScript, AJAX and CSS; 3) experience in Photoshop, Dreamweaver, Flash and CorelDRAW. Candidates should be able to develop web applications using PHP/MySQL; understand, analyse and modify existing popular open-source software; have good knowledge of AJAX and osCommerce; demonstrate excellent skills in Linux, Apache, MySQL and PHP programming (LAMP); design and maintain MySQL databases and tables; develop and maintain new websites; and demonstrate previous experience with e-commerce applications.
303. Group head/publication writing job
To continue building and also operationally manage a Publication Writing Group that will ensure timely production of high quality publications (manuscripts, abstracts, short communications, review articles, publication alerts) and other written communication material (slide presentations, posters…etc) supporting Novartis brands, research activities and business needs. These would be meant for publication in scientific/medical journals or
for presentations in scientific/Health Authority meetings, based on clinical study reports, other clinical documentation (e.g. reports of meta-analyses and clinical summary documents) and general scientific/medical literature. Publications and other communication material must conform to the high ethical standards of Novartis, Health Authorities and the industry for transparency in communicating clinical research results, as outlined by GCPs and industry guidelines as well as Novartis SOPs and
NIPs. Work also includes liaising with internal and external contributors, including authors & KOLs, and liaising with the publication manager and/or medical/scientific journals to organize submission of publications. Assistance will also have to be provided for planning publication and communication strategies. Minimum requirements: Minimum university post-graduate science degree or relevant equivalent background, including a medical degree;
Advanced knowledge of Clinical Research Processes & Drug Development, with at least 5 years relevant experience; Expert knowledge in publication writing and publication management systems; good knowledge of other communication tools (poster production, slide-decks). At least 5 years relevant experience in the pharmaceutical industry or academia, of which at least 2 years were in a senior managerial position.
304. Scientific writing job
Job Description: Develop scientific medical content (medical writing) for all projects, meeting international quality standards. Provide medico-marketing inputs for new product development and launches. Prepare training manuals and product monographs.
Prepare CME presentation slides for doctors. Deliver content that is rated high on depth, comprehensiveness, quality and timeliness, interfacing with clients on a regular basis.
Develop innovative content products by leveraging the depth of scientific content.
Candidate Profile: Qualification: MD / MBBS / M.Pharm
You need to have a minimum of 1 year, and preferably over 3 years, of experience in pharmaceutical
companies in any of the following functions: Medical Services / Medico-marketing / Clinical Research / Regulatory Affairs, or in medical writing / developing / publishing medical content.
It would be preferred if you have a specialized understanding of any specialty area of medicine or an overall general understanding of the medical field. Must have strong written and oral communication/presentation skills. Should have a passion for networking and keeping up with the latest technical/scientific developments and relating them to various projects. Extremely well organized, with a willingness to continually learn to improve the way we do things.
Graduate in BA/MA (English). Should have excellent command over written English and reading comprehension. Should be a team player with a positive attitude and good work ethics. Good convincing skills in English. Good communication, interpersonal and analytical skills. Good at presentation, searching and researching. Assessment & development.
The person should be enthusiastic and willing to work hard. Freshers may apply.
305. DATA CONVERSION JOBS
With the incompatibilities that exist between computer systems today, getting the data you need is not always that easy. If your organization needs to acquire or exchange data and has run into this obstacle, we can help. Our library of conversion software can process data generated by nearly all commercial PC software packages. We can convert any 3480/3490E cartridge or 9-track tape produced by a mainframe system to a PC compatible format. EBCDIC/ASCII conversion. We can help PC or
mainframe based organizations acquire the data needed in the proper format. Our in-house programming staff can handle special file conversion, parsing and translation. Custom reformatting, upper/lower case conversion, merge/purge processing and many other services. PDF has become the most widely used electronic format in litigation, technical & reference documents and research material. It is the preferred way to distribute documents or publish them to the web and
we can take you there for a fraction of in-house costs. No expensive equipment, software licenses and training, just high-quality results at very competitive prices. For data that exists only in printed form, Dataentrysindia has state-of-the-art scanning facilities and expert technicians to scan documents or images for conversion to digital format. Dataentrysindia also maintains a full editorial staff proficient in manual data entry, as well as proofing and editing, ensuring that the data entered is accurate.
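The EBCDIC/ASCII conversion mentioned above can be sketched in a few lines of Python, since the standard library ships several EBCDIC code pages. This is a minimal, hypothetical sketch assuming code page 500; real mainframe tapes may use cp037 or another variant, so the correct code page must be confirmed for each source system.

```python
def ebcdic_to_ascii(data, codepage="cp500"):
    """Decode EBCDIC bytes into text via one of Python's built-in
    EBCDIC code pages (cp500 assumed here; mainframe data often
    uses cp037 or another regional variant)."""
    return data.decode(codepage)

# "Hello" encoded in EBCDIC code page 500
ebcdic_bytes = b"\xc8\x85\x93\x93\x96"
print(ebcdic_to_ascii(ebcdic_bytes))  # -> Hello
```

For the common alphanumeric range, cp500 and cp037 agree, so either code page would decode this sample; they differ mainly in punctuation characters.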
If you are unsure of your data management needs and requirements, Dataentrysindia can help you identify them by performing a consulting engagement. If you already know your requirements and need data conversion, database design, or data output services, Dataentrysindia's full staff of data specialists can provide those services quickly, accurately, and cost-effectively. Data Conversion: Regardless of the current state of your data, Dataentrysindia provides the necessary services to convert it to a platform-neutral, electronic format like XML that is suitable for importing into a data repository.
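The conversion to a platform-neutral XML format described above can be sketched as follows. This is a hypothetical, minimal example using Python's standard library; the element names `records`/`record` are illustrative only, not Dataentrysindia's actual output format.

```python
import xml.etree.ElementTree as ET

def records_to_xml(records):
    """Serialize a list of flat record dicts into a simple XML document,
    one <record> element per input row."""
    root = ET.Element("records")
    for rec in records:
        item = ET.SubElement(root, "record")
        for key, value in rec.items():
            ET.SubElement(item, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

rows = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
print(records_to_xml(rows))
```

Because every field becomes a named element, the resulting file can be imported into any XML-aware repository regardless of the platform that produced the original data.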
306. Data entry jobs
Dataentrysindia offers much lower price rates for your data entry work, as our operating costs are much lower compared to others. Dataentrysindia takes special care to estimate the correct pricing applied to each client's data entry requirements with sensitive turnaround times, assuring true value for what you pay. We can offer you our best prices for data entry projects, which are calculated on the following basis:
Volume of the order. Frequency of orders, such as weekly and monthly. The clarity and detail provided in specifications. Required accuracy. You have the choice of availing our special pricing scheme, applicable to all types of data entry work. We require a representative sample of the data prior to confirmation of the prices. The normal charges apply to easily legible data that is printed, typewritten or written in very neat handwriting in English.
We do not guarantee levels of accuracy for poor quality data such as very pale photocopies, very dark photocopies or difficult-to-decipher handwriting. Poor quality of script drastically slows down our operations and also leads to increased charges. We undertake only data entry work that contains a minimum of 2 megabytes. We are one of the pioneering data entry companies in India, maintaining a "universal standard" for transferring data between different databases and
customizing data-entry input-output to meet and satisfy even the most demanding customer requirements. Dataentrysindia, one of the pioneer data entry companies in India, has adequate backup systems, a data security policy, and business continuity plans to ensure that the customer's data is safe and confidential as long as it is handled by us. Each member of the data entry team performs in a supervisory or managerial capacity and is responsible for the data saved in the designated system.
307. Medical transcription jobs
One of the biggest benefits of a career in medical transcription is that you can choose your working hours and how much work you take on. Companies allow you to set your own schedule, which is an advantage especially for working women who have to look after a home or young children as well as work. If you are looking for a transcription job, all you need to do is take up courses or
training to get into this field, plus a little searching for a job. It may be hard to find the right one in the beginning, but once you get the right job you will not regret it. Transcription training courses are available for those who wish to gain basic knowledge or specialize in this field. Usually the turnaround time for a transcription is twenty-four hours. Medical transcription is remunerative work as well.
Medical Transcription (MT) involves listening to audio recordings of reports, diagnoses, etc., dictated by doctors, physicians, or other healthcare professionals, and transcribing the audio files into text format. Medical transcription involves formatting and typing to specific formats or criteria, and converting the audio record of patient care information into an easily readable written form.
Transcription, especially in MT, requires correct spelling of medical terms and editing of medical terminology and/or errors in dictation. Transcriptionists also edit transcribed files and documents, and print and return the completed documents on time. Transcription reports must also comply with medical and legal issues and concerns, procedures and policies, and applicable laws regarding patient confidentiality.
308. Legal transcription job
To lead and manage a team for document review projects. To assist sales personnel on conference calls, face-to-face meetings, drafting RFIs and RFPs, and writing SOWs. To manage delivery for law-firm client outsourcing relationships in the area of document review. To serve as a liaison between clients, sales personnel and delivery/operations. To manage workflow to assure timely delivery of services. Quality control of Associates' work.
In the Legal jobs category, GS Infocomm Data Pvt. Ltd. offers a job vacancy for Legal Transcriber (Audio & Copy Typist) based at Delhi. An ideal candidate applying for this position needs to possess a Graduate in English (Hons) qualification with a minimum of 0.5-1 years' experience. Candidates should have experience in Legal/General/Business Transcription. Good computer/typing skills are a must. Interested candidates may apply with their resume.
Transcription services are of various kinds. These include generic transcription services, in which you transcribe material such as presentations or group discussions (GD); this set of work does not entail the need for a specialized background and thus falls into this category. Other transcription services embrace work which necessitates a specialized background, such as medical & legal transcription services, which require thorough knowledge of the respective fields: medical and legal.
• Should not be reluctant when it comes to training sessions. • Good listening abilities; must be an avid & voracious reader. • Ability to transcribe at least 60 audio minutes of recordings a day. • Good typing speed; must be able to type at least 55 words per minute (WPM). • Internet savvy, with good working knowledge of Microsoft Word and Excel. • Good proficiency in the English language. Skills in vernacular languages are going to be a plus point.
309. General transcription jobs
A monthly income is mandatory for survival, and a perfect career has its infinite charms. Still, a full-time career may not be everybody's cup of tea. There could be personal equations that prevent people from accepting a job that requires daily traveling, as well as submitting to the time schedules and other regimens of an office. This happens most often in the case of women. A big problem solver for such people is what is known as transcription. A transcription job involves converting an audio recording into a printable document, and can be medical transcription, legal transcription, or general transcription services. In all three, fast typing skills with good grammar and punctuation, and also the ability to listen and decipher things correctly, are absolutely necessary. While these are the common requirements for any type of transcription job, medical and legal transcription require certain additional skills as well.
Audio recordings of the General Transcription category could be on any subject like debates, discussions, meetings, seminars, inaugural speeches, ceremonial functions, or radio and TV programs. No special subject knowledge is required for transcribing these tapes and cassettes. If the employing company is satisfied with your listening powers, deciphering abilities, typing skills, and language skills, you are likely to land the job.
The main attraction of a transcription job is the fact that you can do it at any time from home, according to your convenience and at the pace you prefer. Work can be found online, and finished work can be submitted online. You can just sit in your bedroom or study and finish the whole job, instead of having to get dressed and drive or pedal or take the local train to get to your office. It will not make any difference to you whether it is snowing or raining outside.
310. CAD jobs
To plan, organize, and supervise the activities of a specialized departmental computer-aided drafting operation; formulate concepts for and develop new and modified computer drafting applications to meet engineering and other requirements; coordinate the work of drafting, design, and technical personnel; and perform related work. This is a single-position journey-level class allocated only to the Department of Public Works. Under general supervision,
this position reports to the DPW Unit manager and is primarily responsible for planning and coordinating design information in relation to public works projects.
The examples of functions listed in this class specification are representative but not necessarily exhaustive or descriptive of any one position in the class. Management is not precluded from assigning other related functions not listed herein if such functions are a logical assignment for the position.
Plans, organizes, and coordinates the work of drafting, design, and technical staff in designing, implementing, and maintaining drafting and design data information systems for the department. Consults with staff to assess data processing needs including computer-aided design, and drafting. Prepares recommendations based on cost benefit analyses.
Coordinates with staff, end users and contractors to implement and maintain project and design information systems.
Schedules project design, testing, and implementation. Develops operating policies, procedures, and ensures compliance with departmental design standards. Presents formal design reviews at designated stages of development to management. Prepares computer center budget.
Trains drafting and design personnel. Validates blueprints and provides related calculations for engineering projects. Writes and maintains custom menus and Lisp and Visual Basic application programs.
311. Drafting jobs
Drafting Jobs is a specialist job-find portal serving structural drafters / design drafters and draftspersons specialising in structures. Structural design drafters / drafters / draftspersons at manager, lead, senior, mid and junior levels, working for specialist civil structural firms, multidisciplinary consultancies and EPCM firms. Commercial / residential: mid-level apartments, high-rise apartments, hotels, commercial, sports and recreational, health, retail, educational buildings, facades, heritage.
Using construction methods and software including AutoCAD, 3D modelling, Civil Cad, Revit, structural steel, reinforced concrete (RC), post-tensioning (PT), pre-cast and masonry structures. Light industrial and heavy industrial: bridges, defence, mining and metals, minerals processing. Admittedly, drafting job descriptions for every position within a fundraising campaign is time consuming. Keep in mind, however, that the hardest work is in the initial drafting.
After that, descriptions can be revised and updated from year to year. One way to lessen the burden of the task is to ask each current job volunteer to draft the description at the completion of their service, then simply format them and proof for errors. This also ensures that the description is accurate and inclusive of all requirements. Volunteer job descriptions allow participants to plan their time and skills accurately and avert difficulties later on.
In time, the description will come to serve as a resource, a manual of sorts, for individual volunteer positions. Volunteer job descriptions quickly prove their worth, even though drafting each description takes time and many fundraising leaders don't know how to write them. Works closely with engineers or designers to prepare and revise a wide variety of complex drawings and custom chip design layouts where use of initiative, judgment, extreme care and knowledge of company and/or DoD specifications are required.
312. 3D drafting jobs
3D Polygon modeling is the process of developing three-dimensional objects using a specialized software package designed for the purpose. These modeling applications provide an area of virtual space to work in, and a 3D model is developed in this space using rules of geometry measured by the Cartesian coordinates X, Y and Z to represent width, height and depth. 3D modeling software packages include: Wings 3D: A free 3D package that offers the basic modeling toolset.
A good entry-level application. Blender: An open-source 3D modeling package, Blender is free to use. Blender has many more features than Wings 3D and should be considered an intermediate application, as its learning curve is quite steep for a beginner 3D modeler. Milkshape by Chumbalumsoft: Initially designed for modding the game Half-Life, Milkshape has become the 3D modeller of choice for many independent game developers. It retails at a very reasonable $35.
Autodesk 3D Studio Max: A comprehensive modeling package aimed at professional studios rather than private individuals. The latest version of 3D Studio Max retails for $3495. It is possible to purchase earlier versions for a reduced price. Maxon Cinema 4D: Another accomplished modeling package, the latest release of Maxon's application is priced at $3695 for the intermediate software bundle. Autodesk Maya: 3D Studio Max's traditional rival, Maya is another premium modelling package.
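The vertex-and-face representation described above, with Cartesian (X, Y, Z) coordinates for width, height and depth, can be sketched in code. This is a hypothetical, minimal polygon mesh for a unit cube, shown in plain Python rather than in any of the packages listed:

```python
# A minimal polygon-mesh sketch: vertices as (X, Y, Z) coordinates and
# quad faces as tuples of indices into the vertex list.
cube_vertices = [
    (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),  # back face corners
    (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1),  # front face corners
]
cube_faces = [
    (0, 1, 2, 3), (4, 5, 6, 7),  # back, front
    (0, 1, 5, 4), (2, 3, 7, 6),  # bottom, top
    (0, 3, 7, 4), (1, 2, 6, 5),  # left, right
]

def bounding_box(vertices):
    """Return the (min corner, max corner) box enclosing all vertices."""
    xs, ys, zs = zip(*vertices)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

print(bounding_box(cube_vertices))  # -> ((0, 0, 0), (1, 1, 1))
```

Every modeling package named above stores essentially this structure internally; the tools differ in how they let the artist create and edit it.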
3D Modeling Skills in Demand: 3D modeling is a demanding discipline, covering a variety of mathematical and artistic abilities. However, there is a growing demand for skilled creators of 3D content in a vast variety of fields. As a hobby, it is vastly entertaining. As a career option, 3D polygon modeling can be very rewarding. 3D polygon modeling is used for movie special effects, for video game content and to make photorealistic renders for print and the web.
313. Broadbanding jobs
Broadbanding is the grouping of jobs with similar duties, responsibilities, and levels of accountability. Broadbands widen salary ranges in order to facilitate organizational flexibility, encourage individual career development, and support market competitiveness. The use of broadbanding also reduces the number of job classifications. Broadband classifications are generic in scope and describe the typical duties performed as well as the knowledge
and skills essential to the classification. The site-specific duties that belong to each of the jobs included within that classification are documented on the Position Description Questionnaire form. Broadband job titles are very general in nature and are designed to identify and define the job family. They also provide a consistent and common standard by which these families can be utilized across departments and offices within the University.
A department may certainly utilize more descriptive internal working titles on business cards and correspondence. For example, in the broadband classification of IT Analyst I, an employee may use the working title of Web Developer. Hourly and monthly pay bands have been developed to allow flexibility in setting salaries competitive with others in similar job classifications drawn from the relevant labor market from which we recruit.
OU Human Resources is committed to supporting the University of Oklahoma's ability to attract and retain a highly qualified and diverse staff. Broadbanding, an option that is becoming very popular, is one that will simplify the pay plan substantially by using bands. In the past decade much discussion has been given to using a broadbanding approach to meet compensation challenges. Broadbanding is defined as a strategy for salary structures that consolidates a large number of pay grades into a few "broad bands."
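The consolidation of many pay grades into a few broad bands can be sketched as follows. This is a hypothetical Python example with illustrative salary figures, not actual University of Oklahoma pay data: each band simply spans the minimum and maximum of the grades it absorbs, which is how bands end up with the widened ranges described above.

```python
# Hypothetical sketch: consolidating six narrow pay grades into broad
# bands. Grade -> (min salary, max salary); figures are illustrative.
pay_grades = {
    1: (30000, 34000), 2: (33000, 38000), 3: (36000, 42000),
    4: (40000, 47000), 5: (45000, 53000), 6: (50000, 60000),
}

def broadband(grades, grades_per_band):
    """Group consecutive grades into bands; each band's range spans the
    lowest minimum and highest maximum of the grades it contains."""
    items = sorted(grades.items())
    bands = []
    for i in range(0, len(items), grades_per_band):
        chunk = items[i:i + grades_per_band]
        lows = [lo for _, (lo, _) in chunk]
        highs = [hi for _, (_, hi) in chunk]
        bands.append((min(lows), max(highs)))
    return bands

print(broadband(pay_grades, 3))  # -> [(30000, 42000), (40000, 60000)]
```

Six grades collapse into two bands, each with a much wider salary range, illustrating both the flexibility gain and the reduction in the number of classifications.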
314. Business development Officer Job.
Responsible for international sales & marketing for software services, ERP, CRM etc. in the assigned territory. Should manage existing accounts/clients and identify/create new accounts/clients. Responsible for the complete sales process, including payments. Analyse market trends and recommend changes to marketing and business development strategies based on analysis and feedback. Prepare tender bids. Should prepare and submit MIS &
other reports on time. Should be willing to travel extensively overseas. Build and maintain a high performance culture through effective performance management, communication and coaching of staff. Manage multiple projects, prioritize work and balance strategic and tactical issues. Establish appropriate remuneration levels and performance-based conditions for the business development team. DESIRED PROFILE: Should be a graduate, preferably MBA, with very good knowledge of at least one European language (fluently speak, understand & write). 1-2 years of work experience in international call centre BD or software services sales. Excellent client or customer interfacing skills. Excellent analytical skills to capture client requirements and relate them to the organization's solutions.
Successful track record of developing win-win partnerships in competitive marketplaces. Thorough knowledge in understanding the business implications of new and existing partnerships and their implications for business strategies. Advanced skills to seamlessly integrate client needs with the organization's service offerings to attain revenue goals. Excellent skills in negotiation, contracts, forecasting and strategic planning in a business environment.
315. sales and marketing manager job
To look after sales of speciality products which are sold to the Personal Care and FMCG industry, locally as well as for export; handling of current sales, business development, increasing the distribution network worldwide, and new product identification and market research. Require a very dynamic and sincere candidate with international exposure to the Personal Care industry. He will be directly dealing with customers (MNC & non-MNC), distributors worldwide and stockists.
He will have to service the current customers as well as develop new customers and distributors worldwide. He will also interact with the R&D Dept. to suggest & develop new products in this industry. Requirements: 1) Must be a B.Sc Chemistry graduate. 2) Must have done an MBA in Marketing. 3) Should have 7 to 8 yrs experience in a chemical company dealing in speciality chemicals, especially cosmetic and personal care FMCG products. 4) Should have exposure to domestic and international markets. Salary is not a constraint for the right candidate. The company is the manufacturer and supplier of speciality chemicals to pharma, chemical, cosmetic and personal care companies. The company, from its inception in 1978, has made itself the acknowledged leader in the manufacturing of various bulk drug intermediates and personal care ingredients. Today, the company is the preferred supplier to pharmaceutical giants and Personal Care Industries
and has an impressive list of customers in the domestic market and globally as well. The company has achieved this with quality products, good business practices, and an impeccable reputation as a reliable supplier with strict adherence to delivery schedules and quality. Domestic market, international market, cosmetics, personal care, chemicals, good communication and good liaising.
Specialization:Channel Sales,Corporate Sales,International Business.Job Function:Sales/Business Development
316. Regional sales Manager Job.
Duties and Responsibilities: Responsible for sales in all aspects of work, actively sourcing hot leads for the organization. Ability to achieve targeted revenue goals. To be involved with product promotion and development, identification of new business opportunities and the development of marketing strategies, keeping in mind the organizational standards and goals. Profile: A go-getter who displays a high standard of professional acumen and
delivers great levels of promotion in sales. Responsible for capturing sales through inquiries, events and other sources. Accept direction and implement sales skills. Support in managing, planning and controlling the day-to-day operation of the Sales Department to maintain efficient and optimum sales standards. Maintain regular contact with all departments and, where appropriate, submit sales leads within the LEONIA organization.
Job Responsibilities: To support the LEONIA philosophy towards guest satisfaction and quality enhancement. Be a team player for constructive advantage. Assist in the development of the Sales and Promotional Plan. Establish a regular call pattern for meeting with principals of target markets. Participate in trade shows, conventions and promotional events within the hotel, the industry and customer organizations as necessary and directed. Prepare weekly, monthly,
quarterly sales reports as per the required formats. Liaise closely with other departments within the hotel to ensure efficient and regular communication of sales activities. Actively source new accounts and develop existing accounts that will meet the yield objectives of LEONIA from these segments. Assist the Manager - Sales in developing strategic sales plans, detailing sales, distribution plans, trade shows and other promotional opportunities in order to meet LEONIA's overall strategic plan.
317. Inside sales head job
Responsibilities: Head the Inside Sales and lead generation team. Ensure the inside sales team works to its full potential towards achieving the team target. Responsible for identifying, scheduling and monitoring lead generation campaigns to prospect, qualify, develop, and document potential marketing opportunities for all services. Capture pertinent customer data elements to increase knowledge of the prospect and work with field sales and
subject matter experts to tweak the sales pitch. Relationship management activities with business prospects in the assigned territory. Help coordinate pre-trade-show calling in marketing to attract trade show attendees to the Company's booth. Provide post-trade-show support to follow up with leads generated at the show. Data mining for prospective leads. Responsible for acquiring new accounts and growing existing accounts to full potential, generating maximum revenue on a long-term basis.
Arrange to develop and administer a marketing database which includes client and prospect information gathered through various website searches. Preparation and presentation of a quarterly Inside Sales report with detailed analysis of team performance. Forecasting the various key accounts in the funnel. Prerequisites: Experience should be between 10 to 14 years, with relevant experience heading inside sales teams for reputed software services companies. Must have worked for US
and Europe markets recently. Must have extensive team-heading experience. Should be excellent in lead generation campaigns. Data mining skills to identify prospective leads. Work Location: Chennai. Skills: Inside Sales, Lead Generation, Team Management. Specialization: IT/Telecommunications, International Business, Telesales. Job Function: Sales/Business Development. Industry: IT-Software, Software Services.
318. Agency manager job for MNC.
Key Skills: MAN MANAGEMENT . Specialization: Banquet Sales,Client Servicing,Counter Sales,Healthcare Sales,International Business,Relationship/Account Servicing. Job Function:
Sales/Business Development. Industry: Financial Services/Stockbroking, Gems & Jewellery, Real Estate. Qualification: School & Graduation. Level: Middle - Manager, Assistant Manager
JOB DESCRIPTION: -Sales Manager should identify potential advisors and
recruit them to achieve the minimum expected sales target on a consistent basis. - Identify and recruit potential advisors who are highly productive, in accordance with the business plan and within the stipulated timeframe. - Achieve the expected minimum sales target on a consistent basis through own team of advisors. - Continuously review performance, set targets for the team and consistently raise them. - Provide training and monitor developmental needs of
advisors in the team; provide feedback on advisors' performance, particularly those with less experience (mentor role). ELIGIBILITY: - Graduate with 2-5 yrs experience in retail sales. - Experience of sales in the financial industry would be an added advantage. - Consistently good sales record or sales management role with potential to lead others. - Strong leadership qualities and ability to coach and mentor others. - Excellent verbal and written communication skills.
Continuously review performance of own team of Financial Consultants and improve their productivity on an ongoing basis. Provide leadership and management to the team of Financial Consultants. - Aggressive, with drive and high energy levels, showing potential to mentor/lead a team of advisors. - Good networking skills to drive and generate leads to recruit influential advisors. Provide training and monitor development needs of Financial Consultants in the team.
319. Sales manager job
Key Skills: SALES, CHANNEL SALES, RETAIL SALES, DIRECT SALES, CONCEPT SALES, INSTITUTIONAL SALES, CORPORATE SALES, AREA SALES MANAGERS, KEY ACCOUNT MANAGER, TERRITORY MANAG. Specialization: Advertising/Media/Arts, IT/Telecommunications, International Business, Logistics/Transport/Supply, Merchandising, Retail Sales. Job Function: Sales/Business Development. Industry: Advertising/PR/Event Management,
Identify potential Financial Consultants (Insurance Agents) based on agreed profiles and ensure recruitment in accordance with the business plan. Initially, for the first three months to achieve targets of direct sales and recruit the minimum expected number of Financial Consultants. To identify more Financial Consultants and to complete recruitment of the financial consultants as expected and communicated by the management, within the given time frame.
To also achieve the minimum expected sales target on a consistent basis through own team of Financial Consultants. To continuously review the performance of own team of Financial Consultants and to improve their productivity on an ongoing basis. Provide leadership and management to the team of Financial Consultants. To endeavour to build and maintain a team of highly productive Financial Consultants. Provide training and monitor development needs of Financial Consultants in the team;
provide feedback on Financial Consultants' performance, particularly those with less experience (mentor role). Set targets for consultants, review these & consistently raise them. Good academic record. Excellent professional track record. Experience of sales in a financial services industry would be an added advantage. Strong leadership qualities and ability to coach and mentor others. Knowledge of the marketplace. Excellent verbal and written communication skills.
320. National marketing manager job
Pinpoint Pty Ltd is one of the largest marketing companies in the Asia Pacific Region and is creating an Indian Country Team who will be charged with launching an exciting new Loyalty Program. You can join this groundbreaking Indian country team at such a fascinating time. We are currently seeking an experienced, innovative, dynamic Marketer to lead the Marketing team. You should like a challenge and value the experience available in working in a start up environment.
Come on board at this programs development phase, bring it to launch in 2009 and manage it to achieve great success beyond. This senior Marketing role includes: Client ManagementDevelop effective client relationships with high profile Card Issuers and other leading Partners to develop and implement successful marketing strategies; Support clear lines of communication between Pinpoint and Clients at a range of levels to ensure both parties needs are being met.
Marketing - Prepare and implement marketing plans and tactical campaigns. Look for opportunities for further leverage and new directions. Prepare results of marketing initiatives for clients, partners and the Executive Board. Program Management - Budget development and monitoring of the P&L; goal setting and monitoring of key performance indicators; reporting and monitoring; liaising with Customer Service and external fulfilment providers to ensure day-to-day operations of the program are on track.
Staff Management - Managing direct reports to ensure good relationships, skill development and clear, open lines of communication. Weekly WIP meetings with direct reports to monitor progress and discuss prioritisation, tasks and deadlines. Oversee the day-to-day activities of direct reports, including account management, marketing collateral production and supplier management. You will bring to Pinpoint your experienced marketing skills (postgraduate qualified).
321. Chief investment officer job
Key Skills: Investment Life Cycle, Investment Process. Specialization: Investment Banking, Investor Relationship. Job Function: Finance. Industry: Banking, Financial Services/Stockbroking.
Position: Chief Investment Officer. Reporting to: Managing Director & CEO. The company is recruiting a CIO for its recently launched Q India Private Equity Fund, which will focus on investments in logistics, energy, tourism-related infrastructure and urban development in India. Roles and Responsibilities: The CIO will be expected to manage all aspects of the fund's investments through the investment life cycle (including deal sourcing, investment structuring, monitoring and exit). Given that this is a greenfield venture, the CIO will be expected to establish the investment process, which will include building frameworks to analyze, approve and present investments to the investment committee. In addition, the CIO will be charged with building an investment team and
liaising with the Mauritius-based management company. Furthermore, the job includes building and maintaining relationships with the Fund's investors. Experience: The potential candidate should have 10-15 years of experience in identifying, evaluating and executing investments in infrastructure or related spaces. On the investment side, this experience includes deal sourcing, undertaking due diligence, performing risk assessments and investment structuring. The individual should have experience in monitoring live investments and providing feedback as appropriate. On the exit side, the experience includes formulating and executing appropriate exit strategies. Experience across the entire deal cycle is preferable. Qualifications: Preferably a CA, CFA or an MBA from a reputed institute. There are probably over 50,000 money managers in the world, of which an estimated 30,000 are in the USA; India has perhaps 40.
322. Personal secretary job
Specialization: Administration, Office Management & Coordination, Front Office, Personal/Secretarial, Receptionists, Stenography. Job Function: Administration, Front Office Staff/Secretarial/Computer Operator. Industry: Accounting-Tax/Consulting. A Front Office cum Secretary is required for a fast-growing financial management company at Jhandewalan Extn. (Delhi). Only mature female candidates need apply. Must be able to handle front office activities.
Good communication skills and a good command of English; good typing and dictation skills; well conversant, handling customer queries with patience. Must have scored 55% and above from 10th class onwards. Desired Candidate Profile - We are looking for a female Front Office cum Secretary candidate with good communication and presentation skills, a mature lady between 25 and 30 years, with a good knowledge of taking dictation, good typing skills, and the ability to draft letters on her own.
Note: Must have scored 55% and above from 10th class onwards. Company Details: KKCSL's endeavour is to make our clients comfortable so that they can remain focused on their business at all times. We are dedicated to world-class services; through continual growth and expertise, our team provides clients with a superior level of service and skill that has generated a reputation for excellence. We take a genuine interest in providing the best possible result at the best possible value. The Company has four departments, namely the Business Finance/Financial Consultancy Department, the Proprietary Audit/System Audit Department, Business Advisory Services, and the Tax Planning & Compliances Department, at the corporate office and Delhi branch, with one soon opening in Mumbai. The Company plans to be present globally. The Company's assets and strengths are its talented pool of staff. The firm offers structured fund management and project financing from banks.
323. Application developer job
The CIA has ongoing requirements for entry-level and experienced Computer Scientists, Software Engineers, and Web Designers and Publishers. Working in one of several offices that directly support analysis, intelligence collection and other business areas, candidates will analyze, develop and deploy innovative information/software systems to enhance the CIA's ability to collect, produce and disseminate intelligence. Unique state-of-the-art solutions will
be created to reduce information overload through innovative information exploitation, enhance intelligence production capabilities, expand internal and external collaboration services and evolve an expansive intelligence knowledge base. In these positions, candidates will directly participate in team environments and via structured development lifecycles, analyze and define local and/or enterprise information system requirements, perform system/application design, develop capability
prototypes, and develop and implement operational information systems. These positions also offer opportunities for hands-on research and exploration of leading-edge commercial technologies through application/integration of technology in delivering IT solutions. Employees within the organization have opportunities for additional salary advancement to the senior level.Minimum qualifications include the following: a BS/BA in computer science, computer engineering,
information systems; initiative, creativity, integrity, technical excellence and strong interpersonal and communication skills. A GPA of at least 3.0 on a 4.0 scale is also required. Experience in one or more of the following areas is essential. Programming and related experience (not all required): Java, C, C++, Perl, Visual Basic, Oracle, MS Office, Lotus Notes/Domino, Java-based development tools (JBuilder), Excalibur RetrievalWare, Internet/website and content management technologies
324. Senior applications engineer job.
What This Position Will Offer You: Help develop one of TIME Magazine's 50 Best Inventions of 2008: A-Space, an application that ManTech developed for the Director of National Intelligence (DNI) and the Defense Intelligence Agency (DIA). Modeled after popular sites like Facebook and MySpace, and often referred to as a "social network for spies," A-Space enables intelligence analysts to share information more freely, collaborate across agency lines, and connect in ways unseen by most. How You Will Support This Program: • Analyzes and studies complex system requirements. Designs software tools and subsystems to support software reuse and domain analyses, and manages their implementation. Manages software development and support using formal specifications, data flow diagrams, other accepted design techniques and, when appropriate, Computer-Aided Software Engineering (CASE) tools. Estimates software development costs and schedule. Reviews existing programs and
assists in making refinements, reducing operating time, and improving current techniques. Supervises software configuration management. Analyzes functional business applications and design specifications for functional areas such as payroll, logistics and contracts. Develops block diagrams and logic flow charts. Translates detailed design into computer software. Tests, debugs, and refines the computer software to produce the required product. Prepares required documentation, including both program-level and user-level documentation.
Enhances software to reduce operating time or improve efficiency. Provides technical direction to programmers as required to ensure program deadlines are met. • The selected candidate will be responsible for developing Web 2.0-based applications and framework APIs for A-Space. We are looking for an innovative and driven candidate who will be responsible for research, design and development of web-based and framework-based software systems. The selected candidate will also consult with hardware engineers.
325. Customer Project Specialist job
The Customer Project Specialist - Applications (CPS) will provide support for Siemens Middleware Solutions and will be responsible for providing on-site support for system configuration, including pre-installation, installation, training, troubleshooting and post-implementation support of Siemens Healthcare Diagnostics data management and network connectivity middleware products (commonly known as Informatics). The areas of responsibility are
to work closely with Siemens Healthcare Diagnostics first and second level employees as well as management. The CPS must respond to the most difficult system problems, counseling customers, Sales Representatives, Technical Applications Specialists (TAS) and Field Service Engineers on corrective and preventative measures. The CPS will be integral to the planning of an installation, instrument setup, training of lab personnel, and creation of learning tools such as materials /
troubleshooting guides, and standardization of operating procedures. The CPS will demonstrate the leadership and expertise necessary to train, develop and mentor TASs. The CPS position assumes a leadership role in the Technical Applications Organization. Therefore, the CPS will be regionally based in Washington, DC, but recognized as a national resource, and will be required to support any of our customers across North America. This may require extensive travel to meet the needs of our
customers or the needs of the organization. The individual will routinely interact directly with customers, including laboratory managers, supervisors and IT administrators, as well as external LIS vendors and third-party middleware companies. Specific responsibilities include: performing pre-implementation activities, including support of the sales process (meetings and demonstrations) and participating in pre-installation meetings with customers.
326. Software development engineer job.
The incumbent serves as a senior applications software developer in support of OIG. Specifically, the incumbent is responsible for all aspects of software development for in-house managed projects. For outsourced software development projects, the incumbent actively participates in the development of the acquisition package, in design reviews, and in software testing. The incumbent maintains all developed software in use at OIG, including associated hardware. Specific duties will include:
Assists users in translating their information technology requirements into formal written documents that form the basis for contract statements of work. Develops costs and schedule estimates to complete software development tasks. Creates test plans and monitors user acceptance testing to ensure developed software meets or exceeds all approved requirements. Maintains configuration management control throughout the software's lifecycle. Serves as a member of contract source selection teams to evaluate contractor proposals.
Actively participates in software design reviews. Exercises sound project management principles and best practices throughout the development process to keep the project within budget and on schedule.
Manages multiple concurrent projects for establishing and implementing strategies for software applications. Manages and coordinates the design and deployment of new relevant hardware and
software with other enterprise IT projects. Provides a broad assessment of the scope and
resource requirements of assigned projects, including specific results and outcomes. Identifies project issues and risks and makes recommendations on how best to resolve or mitigate them. Provides regular progress reports and briefings that are clear, concise, and well organized. Manages all commercial off-the-shelf software and tools used to develop and support OIG applications to ensure all licenses and required support contracts are current and that software is patched to the latest approved version.
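The COTS license and patch-currency duty described above can be sketched as a small audit routine. This is an illustrative sketch only: the package names, dates, and record fields are invented for the example and are not OIG's actual inventory.

```python
from datetime import date

# Hypothetical inventory of COTS packages; every name and date here is
# invented for illustration.
cots_inventory = [
    {"name": "ReportBuilder", "license_expires": date(2026, 1, 31),
     "support_expires": date(2025, 11, 30), "installed": "4.2.0", "approved": "4.2.1"},
    {"name": "DataGridPro", "license_expires": date(2027, 6, 15),
     "support_expires": date(2027, 6, 15), "installed": "2.9.3", "approved": "2.9.3"},
]

def audit(inventory, today):
    """Flag packages whose license or support contract has lapsed, or
    whose installed version lags the latest approved version."""
    findings = []
    for pkg in inventory:
        if pkg["license_expires"] < today:
            findings.append((pkg["name"], "license expired"))
        if pkg["support_expires"] < today:
            findings.append((pkg["name"], "support contract lapsed"))
        if pkg["installed"] != pkg["approved"]:
            findings.append((pkg["name"], "not at approved version"))
    return findings

print(audit(cots_inventory, date(2025, 12, 15)))
```

Run against the sample data above, the audit flags ReportBuilder twice (lapsed support, unapproved version) and passes DataGridPro.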
327. COMPUTER / ELECTRICAL ENGINEERS job
Computer/Electrical Engineers at the National Security Agency (NSA) push the limits of the most advanced technologies in pursuit of technological breakthroughs that help ensure the Nation's information superiority. Computer/Electrical Engineering opportunities range from fundamental research through advanced development, small to large system design and prototype development, developmental test and evaluation, field installation, and operational support.
Responsibilities: At NSA, you will be involved in different projects designed to enhance your professional growth. No matter what career path you follow, you will experience demanding and exhilarating challenges unavailable to those in the private sector.
Computer/Electrical Engineering career paths include: design of special-purpose computers and antenna systems, pattern recognition technologies, signals analysis, optics, and design, development and testing of electronic communications.
You will be involved in multiple stages of projects, including requirements analysis, design, simulation, experimentation, benchwork, prototype development and testing, manufacturing, and possibly field work. Technical Skills: The following technical skills are needed throughout NSA:
Network Engineering – Design/analysis of LANs/WANs, routers, switches, firewalls, protocols
Software Engineering – Java, C++, XML, HTML, Web applications, object-oriented analysis
and design, rapid prototyping, algorithm development. Communications – Digital and analog, fixed and mobile wireless, satellite, antenna design. Systems Engineering – End-to-end real-time operating systems, signals processing, VHDL/hardware development.
Microelectronics – VHDL, FPGA, microelectronic manufacturing and testing (MCM, SOC), electronic packaging, VLSI. Job Requirements/Qualifications:
328. Software Applications Developer job
Essential Job Functions. A great opportunity for a software application developer to make significant contributions to NOAA's Weather Satellite Programs. The candidate will be responsible for designing and developing automated applications to meet highly complex business needs. Codes, tests, debugs, implements, and documents highly complex programs. Develops complex test plans to verify logic of new or modified programs. Prepares detailed specifications from which programs are developed
and coded. Creates appropriate documentation in work assignments such as program code and technical documentation. Gathers information from existing systems and analyzes program and time requirements. Provides technical advice on complex programming. Plans and designs systems modeling, simulation and analysis for projects crossing multiple product lines or a major phase of significant projects. Participates as an integral part of the design team, coordinates engineers and
support staff in project efforts. Performs highly complex testing and research of software systems to enhance performance or investigate and resolve matters of significance. Monitors and oversees the completion and implementation of technical products to ensure success and timeliness. Reviews literature, patents and current practices relevant to the solution of highly complex projects. Identifies, recommends and pursues technology/practices to apply to solution.
Recommends and implements corrections in highly complex technical applications and analysis to enhance performance. Conducts cost analyses and evaluates vendor capabilities to provide the most complex required products or services. Recommends vendor(s) and approach and presents to senior management/customer as appropriate. Provides leadership and work guidance to less experienced personnel. Provides complex technical consultation to other organizations; interacts with senior customer personnel and internal senior management.
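The duty above to "develop complex test plans to verify logic of new or modified programs" can be illustrated with a minimal unit-test plan. The function under test and the cases are hypothetical stand-ins for this sketch, not NOAA code; the pattern is simply: document expected behaviour as assertions and run them before a change ships.

```python
import unittest

def pad_missing(values, fill=0.0):
    """Replace None gaps in a data series with a fill value; an
    illustrative stand-in for a 'modified program' under test."""
    return [fill if v is None else v for v in values]

class PadMissingTestPlan(unittest.TestCase):
    # Each case records the behaviour the change must preserve.
    def test_gaps_filled(self):
        self.assertEqual(pad_missing([1.0, None, 3.0]), [1.0, 0.0, 3.0])

    def test_no_gaps_untouched(self):
        self.assertEqual(pad_missing([1.0, 2.0]), [1.0, 2.0])

    def test_custom_fill(self):
        self.assertEqual(pad_missing([None], fill=-999.0), [-999.0])

# Run the plan programmatically so it can be embedded in a larger harness.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(PadMissingTestPlan)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A passing run (`result.wasSuccessful()`) is the gate before the modified program moves on to system and acceptance testing.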
329. Programming developer job
A great opportunity for a programming developer to make significant contributions to NOAA's Weather Satellite Programs. The candidate will be responsible for designing and developing automated applications to meet highly complex business needs. Codes, tests, debugs, implements, and documents highly complex programs. Develops complex test plans to verify logic of new or modified programs. Prepares detailed specifications from which programs are developed
and coded. Creates appropriate documentation in work assignments such as program code and technical documentation. Gathers information from existing systems and analyzes program and time requirements. Provides technical advice on complex programming. Plans and designs systems modeling, simulation and analysis for projects crossing multiple product lines or a major phase of significant projects. Participates as an integral part of the design team, coordinates engineers
and support staff in project efforts. Performs highly complex testing and research of software systems to enhance performance or investigate and resolve matters of significance. Monitors and oversees the completion and implementation of technical products to ensure success and timeliness. Reviews literature, patents and current practices relevant to the solution of highly complex projects. Identifies, recommends and pursues technology/practices to apply to solution.
Recommends and implements corrections in highly complex technical applications and analysis to enhance performance. Conducts cost analyses and evaluates vendor capabilities to provide the most complex required products or services. Recommends vendor(s) and approach and presents to senior management/customer as appropriate. Provides leadership and work guidance to less experienced personnel. Provides complex technical consultation to other organizations; interacts with senior customer personnel and internal senior management.
330. Business Development Senior Manager job.
Systems Integration and Technology (SI&T) offers a full range of global delivery services: enterprise solutions, system integration, technical architectures, business intelligence, infrastructure consulting, and technology research/development. Our SI&T consultants can expect to: work with cutting-edge technology, deliver high-quality solutions across multiple industries, work on a variety of projects ranging in both size and scope, receive continuous training, and gain rapid career progression.
Accenture Technology Consulting (ATC) falls within the SI&T organization and provides services aimed at improving the strategic intent, transforming economics and effectiveness, and the overall design, of a company's information technology capabilities and/or functions. Information technology capabilities include IT planning and management, IT processes and approaches, and IT assets that provide the operational components necessary to run applications.Accenture's Security practice helps organizations work through
complex business and technology issues to provide a straightforward approach to information and network security. Our security professionals bring deep technology skills and industry knowledge to any organization and work closely with clients to design and implement a security solution closely tied to business objectives. Our Security Technology professionals provide information security design and implementation services that are unique to SAP Security, Identity and Access Management, and Data Security.
We provide support in developing information security strategies for the enterprise, security tool implementation experience, secure web portal development, experience implementing provisioning tools, project management, presentation development, security information analysis, and understanding of security technologies. We are currently searching for a Security Technology Senior Manager for our Security Technology practice. Key Responsibilities: Maintain positive rapport with the client relationship through effective communications.
331. SAP Business Warehouse (BW) Backend Developer job
People within the Client Operations-Application Outsourcing (AO) workgroup are responsible for the day-to-day provision of long-term outsourcing services to one or several clients. Client Operations is where the majority of Services people reside and typically, people in the workgroup are based permanently at a client location. Our AO Resources can expect to: Incorporate skills which support the programming, management and maintenance services required to implement and support new installations or maintain and
improve existing Legacy systems for our clients. Receive ongoing training to build and extend professional, technical and management skills in all areas. Enjoy our comprehensive and generous benefits package. Job Description: Will work closely with Accenture, teaming partner, and client team members and executives to support the sustainment mission and oversee completion of deliverables on schedule and with high quality. Key Responsibilities: Manage daily data load monitoring and performance tuning,
Manage resolution of data load errors; manage overall implementation of client system change request (SCR) work from requirements gathering to post-production support; manage development of SCR functional designs for each new or modified system component as required; manage SCR testing (unit, system, regression, performance and user acceptance); identify and coordinate SCR production deployment activities; manage SCR post-production support; lead/coordinate support for cross-team impacts as a result of SCRs
Support/collaborate with related functional teams; provide periodic budget forecasts for the SCR sub-team; support project estimating and work planning for SCR activities; will carry an on-call pager on a rotating basis; must submit to a Department of Defense background investigation. Professional Skills: strong written and verbal communication skills; ability to learn functional and technical content quickly; strong facilitation skills; strong client interaction skills; excellent problem-solving skills; ability to work well in a team setting; strong teamwork capabilities and ability to work in a matrix management environment.
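The SCR workflow named in the responsibilities above (requirements gathering through post-production support) can be sketched as a small stage tracker. The stage names and the class are assumptions made for illustration only, not the client's actual change-management tooling.

```python
# Ordered stages an SCR passes through, taken from the posting's
# description; the exact names are illustrative assumptions.
STAGES = ["requirements", "functional_design", "testing",
          "deployment", "post_production_support"]

class SCR:
    """Track a single system change request through its lifecycle."""

    def __init__(self, scr_id):
        self.scr_id = scr_id
        self.stage = STAGES[0]  # every SCR starts at requirements gathering

    def advance(self):
        """Move to the next stage; refuse to advance past the final one."""
        i = STAGES.index(self.stage)
        if i + 1 >= len(STAGES):
            raise ValueError(f"{self.scr_id} already in final stage")
        self.stage = STAGES[i + 1]
        return self.stage

scr = SCR("SCR-1042")
scr.advance()          # -> functional_design
print(scr.advance())   # -> testing
```

The point of the linear structure is that no SCR reaches deployment without passing through functional design and testing first, which mirrors the gated process the posting describes.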
332. Tech Quality Functional Analyst job
People within the Client Operations-Application Outsourcing (AO) workgroup are responsible for the day-to-day provision of long-term outsourcing services to one or several clients. Client Operations is where the majority of Services people reside and typically, people in the workgroup are based permanently at a client location..Our AO Resources can expect to: Incorporate skills which support the programming, management and maintenance services required to implement and support new installations or maintain and
improve existing Legacy systems for our clients. Receive ongoing training to build and extend professional, technical and management skills in all areas. Enjoy our comprehensive and generous benefits package. The ANSS Tech Quality Functional Analyst is responsible for participating in all Tech Quality system change request activities, including FPL management, hotline/warmline management, release and transition support, and production support related to client needs. Key Responsibilities: Support the sustainment mission and complete deliverables on schedule and with high quality; provide system change request (SCR) production support; investigate related system transactional errors; analyze and clarify hotlines/warmlines (break/fix requests); analyze and clarify SCRs with assistance; support SCRs' day-to-day user requests; provide SCR end-user training; support SCR functional areas as assigned. Some limited travel to other client sites may be required. Will carry an on-call pager on a rotating basis.
Must submit to a Department of Defense background investigation. Preferred Skills/Qualifications: 1 year of SAP MM experience; familiarity with SAP IDOC processing; understanding of supply chain processes; production support experience; strong MS Excel skills; government security clearance; strong written and verbal communication skills; ability to learn functional and technical content quickly; strong facilitation skills; strong client interaction skills; excellent problem-solving skills.
333. Capacity Planning and Performance Operations Architect job
People within the Client Operations - IO workgroup are responsible for the day-to-day provision of long-term outsourcing services to one or several clients. Client Operations is where the majority of Services people reside and typically, people in the workgroup are based permanently at a client location. Infrastructure Outsourcing (IO) is the ongoing management and/or improvement of activities related to part of, or the whole of, a technical infrastructure (e.g. information systems, security systems, telecommunication networks),
allowing companies to focus on their core business and competencies. Our IO Resources can expect to: Incorporate skills which support the programming, management and maintenance services required to implement and support new installations or maintain and improve existing Legacy systems for our clients. Receive ongoing training to build and extend professional, technical and management skills in all areas. Enjoy our comprehensive and generous benefits package. Position Description:
The Capacity Planning and Performance Operations Architect will support the client production environment in support of their customer with emphasis on infrastructure application support. This role is part of a team focused on creation of maintenance scheduling; support of incident, change and problem management based on ITIL processes; communication management; support for application and database management; storage architecture and other tasks focused in the area of infrastructure support.
This role will focus largely on application architecture as well as application performance management. Position Responsibilities: Accountable for analysis of system/application performance metrics and reports, and for recommendations based on findings, working with the technical teams to achieve optimized performance. Support disaster recovery and continuity-of-operations activities by understanding provider capabilities and mapping customer architectures. Assist the data center team responsible for implementing all production changes.
334. Sustainment Procurement Development Analyst job.
People within the Client Operations-Application Outsourcing (AO) workgroup are responsible for the day-to-day provision of long-term outsourcing services to one or several clients. Client Operations is where the majority of Services people reside and typically, people in the workgroup are based permanently at a client location. Our AO Resources can expect to Incorporate skills which support the programming, management and maintenance services required to implement and support new installations or maintain and
improve existing Legacy systems for our clients. Receive ongoing training to build and extend professional, technical and management skills in all areas. Enjoy our comprehensive and generous benefits package. Job Description: The ANSS Sustainment Procurement Development Analyst is primarily responsible for designing, developing, unit testing, and assembly testing custom development objects integrated with the SAP Materials Management (MM) module. Key Responsibilities: Work closely with team members to
support the sustainment mission and complete client deliverables on schedule and with high quality; provide system change request (SCR) batch error resolution and performance tuning support; provide Capital SCR batch error resolution and performance tuning support; provide on-call support rotation; resolve Capital SCR production development issues; develop and test Capital SCR production break/fixes and enhancements; establish and update technical specifications and requirements; escalate issues to team leads/managers in a timely manner; support Capital SCR technical design (TD), coding, unit testing, and assembly testing for each system component requiring modifications in the detailed functional requirement; identify and coordinate production preparation activities in support of Capital SCR FPL, hotlines, and warmlines; provide post-production support for Capital SCR FPL, hotlines, and warmlines. Some limited travel to other client sites may be required. Will carry an on-call pager on a rotating basis.
335. Order Fulfillment Development Analyst job
People within the Client Operations-Application Outsourcing (AO) workgroup are responsible for the day-to-day provision of long-term outsourcing services to one or several clients. Client Operations is where the majority of Services people reside and typically, people in the workgroup are based permanently at a client location. Our AO Resources can expect to: Incorporate skills which support the programming, management and maintenance services required to implement and support new installations or maintain and
improve existing Legacy systems for our clients. Receive ongoing training to build and extend professional, technical and management skills in all areas. Enjoy our comprehensive and generous benefits package. Job Description: The ANSS Order Fulfillment Development Analyst is responsible for designing, developing, unit testing and assembly testing custom development objects integrated with the SAP Sales and Distribution (SD) and Materials Management (MM) modules. Key Responsibilities: Work closely with
team members to support the sustainment mission and complete client deliverables on schedule and with high quality; provide system change request (SCR) batch error resolution and performance tuning support; provide on-call support rotation; resolve SCR production development issues; develop and test SCR production break/fixes and enhancements; establish and update technical specifications and requirements; escalate issues to team leads/managers in a timely manner; analyze and clarify SCRs with assistance; support SCRs' day-to-day user requests; provide SCR post-production support of SCR hotlines and warmlines. Some limited travel to other client sites may be required. Will carry an on-call pager on a rotating basis. Must submit to a Department of Defense background investigation. Preferred Skills/Qualifications: SAP ABAP experience; familiarity with SAP IDOC processing; familiarity with SAP Sales and Distribution processing and/or Materials Management processing; understanding of supply chain processes; production support experience.
336. Security Technology - SAP Security Consultant job.
Systems Integration and Technology (SI&T) offers a full range of global delivery services: enterprise solutions, system integration, technical architectures, business intelligence, infrastructure consulting, and technology research/development. Our SI&T consultants can expect to: work with cutting-edge technology, deliver high-quality solutions across multiple industries, work on a variety of projects ranging in both size and scope, receive continuous training, and gain rapid career progression. Accenture Technology Consulting
(ATC) falls within the SI&T organization and provides services aimed at improving the strategic intent, transforming economics and effectiveness, and the overall design, of a company's information technology capabilities and/or functions. Information technology capabilities include IT planning and management, IT processes and approaches, and IT assets that provide the operational components necessary to run applications.Accenture's Security practice helps organizations work through complex business
and technology issues to provide a straightforward approach to information and network security. Our security professionals bring deep technology skills and industry knowledge to any organization and work closely with clients to design and implement a security solution closely tied to business objectives. Secure Business Application Services: Our Secure Business Application Services professionals provide information security design and implementation services that are unique to SAP.
We provide support as part of our Accenture SAP software implementation projects, Identity & Access Management integration projects, and SAP security outsourcing. We are currently searching for SAP Security Consultants for our Security Technology practice. Key responsibilities may include: contributing to a strong client relationship through interactions with client personnel; understanding the engagement as it relates to the client's business.
337. SAP MidMarket Solution professional job.
The Middle Market SAP Solution Sales Professional identifies and closes Mid Market SAP (software and implementation services) opportunities. Responsibilities include identifying and qualifying SAP opportunities: independently, working within the internal sales channels, and with partners. The SAP sales executive utilizes all company advantages and resources and works with the practice to gain client commitment to close the software and services engagement. Full life-cycle sales, from identification through qualification to closure.
Must be proficient in conducting a consultative sale of SAP software and implementation services including the ability to articulate and craft the implementation proposal.Demonstrating expert sales, organizational, business and professional skills, and getting the contract signed! Should have deep experience with SAP Middle Market clients, including demonstrable successful sales and delivery experience. Minimum of 1 year experience in SAP solution selling and delivery of SAP consulting services, CURRENTLY performing
these tasks in a SAP Sales/Consulting environment as their primary role. Master's degree is preferred but not required. Ideal candidate will reside in the San Francisco Bay area of California. Will consider candidates residing in the Western part of the US; with the understanding that San Francisco, CA market place experience is a MUST. Must have a history of solid SAP Solution sales with strong SAP San Francisco Bay area relationships. MUST HAVE Minimum of 1 year experience in SAP solution selling and
delivery of SAP consulting services, CURRENTLY performing these tasks in a SAP Sales/Consulting environment as their primary role. 3 years of experience in identifying, qualifying and closing large/complex business and IT solution opportunities. Bachelor's degree. Travel M-Th/F. Be Your Best, Inc. is a preferred technical & executive recruiting agency to Fortune 100 industry leaders. Over 14 years of success. Technically knowledgeable, easy to work with. Visit us on the web at www.beyourbestinc.com to see all of our current openings, including 20+ security clearance openings.
338. SAP System Analyst, CUSTOMER SERVICE job.
Support and enhance SAP Customer Service system and process solutions for the Americas business unit and other business units around the world. This position will interface directly with internal end users in the Customer Service organization, including service technicians, site supervisors, office coordinators, inventory coordinators, and service renewal personnel. This position will resolve SAP system issues, intake and manage system enhancement requests, and work to continuously improve the process
and SAP solution. Essential Functions:
1. Understand and help resolve issues related to service order hierarchy structures and master data: work centers, resource planning board, project site structures, as-is/as-built equipment, maintenance plans.
2. Offer expert SAP knowledge and support for the customer service functions within contract management, service orders, maintenance plans, billing plans, Mobile Asset Management, and reporting with SAP and BI.
3. Maintain service level support agreements: research and resolve internal customer SAP issues that are routed through the IT support queue.
4. Create and modify reports in SAP.
5. Continuously improve the processes and data quality through an active dialogue with business units worldwide.
6. Intake user enhancement requests, work with users to prioritize requests, analyze requirements, and then work with the SAP Competency Center to configure, test and release the new capability.
7. Maintain documentation using SAP Solution Manager, including business process, configuration, functional specs, test cases, and security roles.
8. Work proactively and independently, and be able to keep up in a fast-paced environment, communicating at all levels.
9. Work seamlessly with the SAP Core Competency Center, which will have resources in Portland, Europe and Asia.
Education, Training & Skills Required: Bachelor's degree in information technology or a related field; 3+ years of experience as an SAP Systems Analyst or SAP Project Consultant; participation in at least 1 full life-cycle SAP implementation; SAP configuration and testing experience in PM (Plant Maintenance), SM (Service Management), MM (Materials Management), and MAM (Mobile Asset Management).
339. SAP business analyst / Data analyst
Adecco Engineering and Technical Services is seeking an SAP Business Analyst. This position is with one of the top medical device companies in the Sorrento Valley area. This company employs more than 55,000 people on six continents and produces annual revenues of more than $80 billion! It is a very stable and reputable company, and they are looking to find the right fit for the position immediately! Job Summary: Provide analytical support to the Quality Assurance Management group with the overall goal of
providing information and analysis necessary to meet business objectives. This includes supporting ad hoc information requests and associated interpretation; understanding the day-to-day business and markets in which we participate; and the ability to be an effective project leader, obtain team consensus, and communicate results in both written and presentation formats. Duties & responsibilities include any of the following, as assigned by the manager:
- Perform inquiry and analysis of SAP transactions and develop reports that will give QA Management insight into the activities of the organization.
- Maintain general reporting systems to provide timely information. Serve as the home office resource for ad hoc information requests relating to Quality/QA Management.
- Provide standard and ad hoc reports to monitor compliance responsibilities and identify potential issues.
- Work on business problems, analyzing and evaluating current business methods and procedures for improvement.
- Provide technical and analytical support in the assessment of potential applications of company products to improve customer business processes and meet customer needs.
- Work closely with end users to define requirements for new or improved systems. Create specifications for systems to meet business requirements. Act as liaison between the end-user department and IT for system-related issues.
- Provide consultation to customers on process improvements and cost/benefit analysis for newly proposed program modifications, methods, and procedures.
340. Implementation specialist job.
Veredus is working with an established company seeking an Implementation Specialist for the following PERMANENT opportunity: The Implementation Specialist will be responsible for end-to-end implementation of our Time & Attendance software solutions into client environments. Up to 50% travel required; can be located anywhere, but must be near a major airport.
o 5-7 years implementing HR, Time & Attendance, Human Capital Management or Payroll solutions
o Prior SAP or Oracle consulting experience a plus
o Bachelor's degree in Computer Science, Accounting or a technical field
o Track record of success with entrepreneurial, fast-paced organizations
o Demonstrated ability to manage projects under tight timelines and exacting budgets
o Mastery of SQL strongly preferred; relational database knowledge a must
o Well versed in common accounting software packages and skilled in basic report writing
o Demonstrated ability to troubleshoot network permission and communication issues
o Software programming/development experience a plus (C# and .NET experience preferred)
o Proficient using project tracking tools such as QuickBase and Salesforce.com; skilled in Word, Excel, and MS Outlook (or a similar email package)
MAJOR RESPONSIBILITIES
o End-to-end implementation of software and hardware solutions for multiple clients simultaneously
o Work directly with clients to validate requirements, provide custom configurations in unique environments and lead training on the use of the system
o Develop a rich understanding of client business requirements and facilitate the required transition to the ongoing client support team
o Cultivate internal relationships across departments in order to resolve client issues efficiently
341. Maintenance supervisor – chemical.
Urgently required: Maintenance Supervisor for SRF Limited. Company Profile: SRF has today grown into a global entity with operations in 4 countries. Apart from its Technical Textiles Business, in which it enjoys a global leadership position, SRF is a domestic leader in Refrigerants, Engineering Plastics and Industrial Yarns as well. The company also enjoys a significant presence among the key domestic manufacturers of Polyester Films and Fluorospecialities. Building on its in-house R&D facilities for the Technical Textiles Business and Chemicals Business, the company strives to stay ahead in business through innovations in operations and product development. Website: www.srf.com. Position: Maintenance Supervisor. Candidates should have 4-6 years of experience in mechanical/utility maintenance in a continuous chemical plant, and will be in charge of mechanical/utility maintenance of the assigned plant during the shift. The candidate will supervise and coordinate all maintenance activities and personnel, including compliance, safe work practices, use of industry standards, and time management processes, and should have good technical knowledge. The candidate will be responsible for preparing technical proposals for the purchase of various project equipment (chillers, compressors, pumps, air receivers & filters), involving thermal and mechanical calculations and costing, and for executing projects such as erection, leveling and alignment of vessels, piping, pumps and equipment. Candidates should hold a mechanical diploma or degree. A good salary package would be provided by the company.
342. electrical instrumentation mechanical engineer job.
We provide training in Industrial Automation & Electrical Switchgear. TTI is Automation, and you are the future. Eligibility: BE/B.Tech - EEE/EI/EC/Mechatronics/Mechanical. Experience: 0-3 years. (Students awaiting final results may also apply.) The next batch is going to start very shortly, so reserve your seat as soon as possible and get a 5% discount. (You can choose any of the modules.) Candidates can also apply for Summer Training and Projects.
TRAINING PROGRAMME 1: INDUSTRIAL AUTOMATION
MODULE 1: 100% Job Guarantee Programme. Duration: 5 weeks + 1 week live project & on-site exposure. Fee: 25,000/-
MODULE 2: Job Assistance Programme. Duration: 4 weeks + 1 week live project. Fee: 16,000/-
MODULE 3: 100% Job Guarantee Programme (with stipend). Duration: 4 months (2 months classroom training + 2 months on-site training). On-site exposure, live project. *Stipend: 6,000/- p.m. (with on-site training). Fee: 50,000/-
MODULE 4: 100% Job Guarantee Programme. Duration: 3 months (2 months classroom training + 1 month on-site training). On-site exposure, live project. *Stipend: 4,000/- p.m. (with on-site training). Fee: 35,000/-
Course Contents (Industrial Automation): PLC (Programmable Logic Controller), DCS (Distributed Control System), SCADA (Supervisory Control & Data Acquisition), AC Drives (ATV11, ATV21, ATV31, ATV61, ATV71), HMI (Human Machine Interface), Field Instrumentation, Panel Designing, AutoCAD, Networking, Wireless Communication.
TRAINING PROGRAMME 2: ELECTRICAL SWITCHGEAR
MODULE 1: 100% Job Guarantee Programme. Duration: 4 weeks + 1 week live project, on-site exposure. Fee: 25,000/-
MODULE 2: Job Assistance Programme. Duration: 3 weeks + 1 week live project. Fee: 16,000/-
MODULE 3: 100% Job Guarantee Programme (with stipend). Duration: 2 months (1 month class
343. Maintenance technician
Ready for a maintenance job with a future? Wish you could take pride in your work, make proper repairs, and know you have real opportunities for advancement, rather than feeling stuck in a dead-end job? This may be the job for you! We're a Class-A company committed to Class-A repairs and maintenance in our high-end apartment communities. No band-aid approaches - we do things right! And we value and promote our good workers. Job Description As a key member of our team, Maintenance Technicians are responsible for keeping our properties in top-notch physical condition, inside and out. Be ready to be busy! A typical day could include plumbing, electrical, basic drywall, carpet, and appliance repairs, learning how to fix a boiler or air conditioner, plus common area and exterior maintenance and repairs. This is a full-time, steady position with regular hours and great benefits. Plus, this position offers
multiple opportunities for career advancement. Requirements: Caring attitude and pride in your work and the apartment property; self-motivated and hard-working; high energy and commitment to high performance in a busy, multi-task work environment; 1 or more years of maintenance experience (construction or maintenance background, multi-family experience, or technical school ideal); HVAC certification preferred; skilled in using power tools; good people skills and a friendly,
helpful attitude; rotating after-hours on-call availability and rotating weekends; good record keeping for work orders, parts orders, inventory and service/tenant follow-up. Why You'd Want This Job: Stable, full-time work with a large, national company; great benefits, including excellent health care plus vacation and tuition reimbursement; great compensation for holidays and overtime; clear opportunities for advancement, including management opportunities, with a well-respected national company (one of America's Most Admired Companies - Fortune magazine, 2004).
344. Green Home Business Opportunity
If you want more than a job change... more than a career change, if what you really desire is a meaningful life change… maybe it's time you thought about starting your very own home business. The truth is: You can find ways to do what you love, and get paid for it. You can create a life where there is balance and time for the people and things that matter most! You can help people and be successful in the process of doing something you love.
Our mission is to make available an ethical business opportunity to those who are serious about controlling their own time, income and future.
Have you been searching for a legitimate home-based business? Have a desire to work from home and spend more time with family? Want to partner with other like-minded men and women? Want to work with a company based on honesty and integrity? Want to partner with a company that cares for people and the environment?
If you answered yes to any of these questions, then take a look at our life-changing Christian home business opportunity! Our company is looking for motivated people. If you're someone who is interested in getting out of a 9-5 daily-grind job AND passionately
committed to your success, then this opportunity is for you! Stop spending your life pursuing someone else's dreams and fortunes, and start going after YOUR OWN. Take control of your financial future, and make a great living from the comfort of your own home with our Christian business.
344. Director – home based services job
Duties & Responsibilities: The Georgia Department of Human Resources, Division of Family and Children Services (DFCS), is seeking qualified candidates for the position of Director, Home Based Services Unit (HBSU). The Division of Family and Children Services (DFCS) is the part of DHR that investigates child abuse; finds foster homes for abused and neglected children; helps low income, out-of-work parents get back on their feet; assists with childcare costs for low income
parents who are working or in job training; and provides numerous support services and innovative programs to help troubled families. Under the broad supervision of the Director, Provider Utilization and Outcomes Management, the HBSU Director is responsible for the development and monitoring of all contracts related to the provision of home-based services. Performance monitoring will include, but is not limited to, oversight of service deliverables outlined in the provider's contract and
specified performance-based outcome measures. This position requires a high level of clinical acumen, particularly in the community-based milieu. The ability to use various clinical practice models to directly improve CFSR performance indicators is essential. Additionally, the HBSU Director will direct the work of assigned professional and support staff, as well as participate in internal and external committees and task forces. AGENCY-SPECIFIC QUALIFICATIONS: Our ideal
candidate will have a Bachelor's degree in a behavioral science and five years of experience in a human services delivery program, of which 3 years are in a supervisory position; or a Master's degree in a behavioral science with a minimum of 3 years of experience in a human services delivery program, of which 3 years are in a supervisory position. Additionally, the candidate should have (and, if invited to interview, be prepared to discuss) one or more of the following: clinical licensure (desired, but not required); experience supervising supervisors; expertise in home-based service delivery models; ability to conduct research and fill in service gaps; knowledge of DHR or government contract documents and processes.
345. Mobile communication manager.
The Sears Smart Toolbox (SST) is the current laptop computer used by over 8,500 Service Technicians in the field. The Mobile Communications Manager is responsible for national oversight of inventory levels of the SST and related assets, vendor management associated with the SST, in-home field support of any SST-related issues, creation of SST-related compliance reporting, and serving as the subject matter expert for SST-related business processes.
Responsibilities/Skills/Experience Requirements
• Leads continuous improvement processes for the entire SST functionality and wireless communication technology through initiatives that improve the functionality and experience of the SST and related assets.
• Collaborates with business managers and acts as the subject matter expert for SST-related topics, including new projects/initiatives.
• Acts as the single point of contact for resolution of national system issues, system inquiries, and improvement suggestions for SST-related enhancements.
• Collaborates with IT, Legal, Help Desk, Public Relations, and other areas of Sears Holdings Corporation to monitor and comply with all business requirements, rules, and regulations.
• Supports Home Services business partners on SST programs, projects, and enhancements.
• Develops and executes crisis management/contingency planning for SST and wireless technology outages and provides resolution through root cause analyses and by facilitating systemic solutions.
• Identifies opportunities for continuous improvement in business activities, including field training materials, QMS documentation, and user acceptance, through the transfer of information and knowledge to the business via the SST and wireless technology.
• Develops and communicates field/business process compliance reporting.
• Develops and executes detailed communication plans associated with the rollout of hardware enhancements and technology upgrades for the SST and other mobile communication devices.
346. Home based accountant job
Description: A well-established CA agriculture company that is family owned and operated is seeking an Accountant/Controller. This is a newly created position and an excellent opportunity to join a dynamic organization. The ideal candidate can be based anywhere in CALIFORNIA or TEXAS. This position will be responsible for:
· International auditing (Mexico/California) and strategic and managerial accounting at all levels.
· Assisting in the preparation and coordination of audits and reviews in CA and MX.
· Coordinating and administering an adequate plan for the control of operations. Such a plan provides expense budgets, accounts receivable, cost standards and break-even analysis.
· Comparing performance with operating plans and standards, and analyzing variances. Providing reports and interpreting the results of operations to all levels of management. This includes preparation of operational commentaries, financial statements and operating data, as well as special reports as may be required.
· Development, analysis and interpretation of statistical and per-unit information in order to appraise operating results in terms of performance against budget standards.
· Assisting in the preparation, coordination and review of monthly consolidated financial statements.
· Analyzing variances from budget standards with input from Mexico partner operation managers. Preparing reports for the CFO and recommending changes in budget standards as warranted.
Competitive salary and benefit package. Bonus. Relocation assistance if needed.
· Advanced degree preferred (MBA, CPA)
· Bilingual Spanish/English preferred
· Minimum 5 years of accounting/auditing experience
· Audit and cost analysis experience
· Strong verbal and written communication skills
· Professional, independent, decisive, optimistic, and focused
· Computer skills: Microsoft Office, strong in Excel
347. home based treatment service job.
THIS IS A PART-TIME/FLEX POSITION WORKING ONE-ON-ONE WITH A CHILD IN VARIOUS AREAS SUCH AS LINCOLN, MAPLEVILLE, SMITHFIELD, BURRILLVILLE, JOHNSTON, PROVIDENCE AND CHEPACHET. ***THIS IS A PART-TIME/TEMPORARY POSITION*** JOB DESCRIPTION: Provides direct behavioral therapy and family skills training by implementing the H.B.T.S. plan approved by the Department of Human Services.
DUTIES INCLUDE: Provides direct behavioral therapy to individuals ages 3 to 21; provides role modeling/training to parents/families; provides support to child and family, when necessary, with appointments, therapy sessions, etc.; provides assistance to the child with self-help & self-care; provides teaching of social & safety skills when applicable; provides data/documentation to H.B.T.S. coordinator as required; provides information to psychologist as
necessary; reports concerns/safety hazards; promotes individualized, person-centered service delivery which meets all identified needs of the child & family; performs other duties as assigned.
348. web analyst.
Description: Responsible for identifying opportunities to utilize consumer information to positively impact profit contribution, analyzing and reporting Interactive consumer metrics, and translating information into actionable recommendations. Responsibilities:
• Define, analyze and manage Interactive customer metrics in order to develop a deep understanding of the Interactive user base.
• Analyze Interactive user metrics, including site visits, interaction, penetration, areas of interest and other metrics, in order to develop a deep understanding and profile of SPC Digital’s Interactive customer segments.
• Communicate Interactive metrics and information in weekly and monthly reports to the Interactive and Executive teams.
• Serve as the key interpreter of data, translating vast amounts of data into actionable information and recommendations for users.
• Proactively present findings, recommendations and opportunities to the team.
• Develop, reconcile and distribute site reports, ensuring complete accuracy.
• Analyze and track trends, site tests and other initiatives as needed.
• Ensure that all stakeholders’ informational needs are being met, and provide necessary training and support.
• Support quantitative consumer research studies, ensuring timeliness and accuracy.
• Analyze demographic, competitive and macroeconomic trends to identify opportunities for SPC Digital Interactive and corporate business opportunities.
• Any other duties as assigned.
Overall Requirements: Candidates should have:
• BS/BA, preferably in an analytical field
• 3 or more years of related analysis experience, preferably with Interactive analysis; those with experience as a Buyer are strongly encouraged to apply
• Superior analytic and problem-solving skills
• A highly proactive, inquisitive and enthusiastic approach
• Excellent communication and presentation skills
• Knowledge of accounting techniques, budgeting, forecasting and planning
• Strong administrative and organizational skills
349. Counter Parts – Joliet job
NAPA Auto Parts is a service organization and recognized industry leader in the distribution and sales of automotive replacement parts and supplies. Genuine Parts Company, founded in 1928 and the parent company of NAPA, is a Fortune 500 company. Its business segments include automotive replacement parts (NAPA), industrial replacement parts (Motion Industries), office products (SP Richards), and electrical and electronic components (EIS). The company serves numerous customers from more than 1,800 operations and has approximately 30,800 employees. NAPA's company-owned stores and our independent owner-operated stores are exciting in that everything we are as a company comes together in the stores. Purchasing, distribution, marketing, merchandising, customer service: all play a major role each day. One of the unique strengths of the NAPA system is the combination of national buying power, training resources, and distribution muscle with local ownership and commitment. The NAPA Spirit is one reason why more people choose NAPA than any other parts supplier. We want dedicated, energetic and driven people to promote the entire NAPA line of products and services to increase market penetration. We are looking for knowledgeable and energetic people who want to sell automotive parts. Are you ready for OPPORTUNITIES to grow your career? Do you need FLEXIBILITY in your schedule? Want to have FUN in your job? Looking for a
GREAT TEAM to work with? Are you ready to sell The Good Stuff? Join the winning team at your local NAPA Store: assisting customers with their auto parts questions and needs (either in person or via telephone); operating a cash register, computer and paper cataloging systems and processing daily paperwork and forms; effectively communicating features, benefits, and warranty policy information to customers; demonstrating a positive, helpful attitude as well as professional conduct and appearance at all times.
350. Part time Merchandiser
Work hours are up to an average of 28 hours per week on a regular basis. The principal duties and responsibilities are as follows:
* Locate and transfer cases of product from the store's backroom to the Kellogg Snacks shelves and displays
* Stock Kellogg Snacks displays and shelves using proper rotation techniques (oldest in front, freshest in the back); remove any stale or damaged packages from shelves or displays and place in the designated store location; "face" products
* Follow the store's procedures to dispose of all empty cardboard cases and paper wrap
* Communicate issues to the Territory Manager or District Manager as appropriate
* Maintain itinerary as stores are added/deleted or the sequence is changed
* Report daily hours worked via the defined system and forward a written timesheet to the District Manager at the end of each week
* Maintain a professional demeanor with the public and with store personnel when performing duties
* Demonstrate safe work practices
The work environment demands the following:
* Work schedule which may include nights, weekends and early morning hours
* Daily travel within the territory and occasional travel outside the territory
* Frequent public contact; must maintain a professional demeanor at all times
Position Requirements: In order to be considered for this position, you must accept the following terms:
* You will be required to have your own transportation to and from the work site.
* Weekly access to the Internet is necessary.
* Mileage is not reimbursed.
* If selected, you will be required to successfully complete a drug screen and background check.
* The average hourly rate is between $8.00 and $13.00 per hour, dependent upon experience and geography.
With 2008 sales of nearly $13 billion, Kellogg Company (NYSE: K) is the world’s leading producer of cereal and a leading producer of convenience foods, including cookies, crackers, toaster pastries, cereal bars, frozen waffles, and meat alternatives.
Friday, June 26, 2009