Professional writing

>> Saturday, September 26, 2009

Professional Writers

A professional writer is someone who has been paid for work that they have written.

Professional writing is/as rhetorical

Professional writing is closely connected to rhetoric. Rhetoric focuses on informing or persuading an audience, and a successful professional writer is able to create interest in that audience. Professional writing is also shaped by the professional world: it is typically produced in a professional setting, whether a workplace or freelance work, by someone who is skilled at writing and who understands the full range of requirements for the documents being composed.
One of the main principles of rhetoric, when applied to the work of professional writers, is the art of effective communication and creating authoritative arguments.

Professional writing in other fields

Even if a student does not plan on writing as a career, they must still prepare for the writing their career will inevitably require. Nearly every professional field requires some form of writing, so a background in professional writing is never wasted. Examples of professional writing in other career fields include the following:
Law
- Case studies
- Briefs
- Client correspondence
Science and Engineering
- Lab reports
- Journal articles
- Technical reports
- Experimental procedures and logs
- Grant proposals
Retail
- Advertisements
- Marketing analyses
- Inventory reports
- Damage reports
Entertainment
- Recording contracts
- Project proposals
- Reviews
- Website authoring
Nearly all professions require written documents; in other words, all staff members produce written documentation of their work. Whether it is a fast food chain looking for additional web development, a law firm editing legal documents, or a musical venue that needs flyers and posters on a regular basis, a professional writer can fit into almost any organization. Professional writers are especially valuable in the workplace for their many talents, including communication skills, creativity, technological proficiency, and other social skills.

Professional writing as compared to other majors

Professional writing, particularly as an undergraduate major, is most often confused with English and/or Journalism due to their similar skill groupings and classes.
English courses often include classes in professional writing and professional composition, emphasizing a clear and technical approach to writing. The majors differ, however, in that English places a larger focus on the reading and analysis of literature. Traditionally, writing within an English major also revolves around essays and critiques, as well as creative writing such as poetry and fiction.
Journalism, while retaining the conciseness that is characteristic to most professional writing documents, tends to produce short and fact-based articles rather than the more in-depth reports within professional writing.
Professional writers tend to have more specific and varied audiences, with a focus that goes beyond facts alone. Professional writing involves advanced writing skills, with an emphasis on writing in digital environments (e.g., web authoring, multimedia writing) and on evaluating rhetorical techniques to tailor writing to specific audiences, and it requires proficiency in writing in a professional setting such as a company workplace or a professional organization.

Content management

Content management, or CM, is a set of processes and technologies that support the evolutionary life cycle of digital information. This digital information is often referred to as content or, more precisely, digital content. Digital content may take the form of text (such as documents), multimedia files (such as audio or video), or any other file type that follows a content life cycle requiring management.
As of May 2009, the world's digital content is estimated at 487 billion gigabytes, the equivalent of a stack of books stretching from Earth to Pluto ten times.

The process of content management

Content management practices and goals vary with mission. News organizations, e-commerce websites, and educational institutions all use content management, but in different ways. This leads to differences in terminology and in the names and number of steps in the process. Typically, though, the digital content life cycle consists of 6 primary phases:
create;
update;
publish;
translate;
archive; and
retrieve.
For example, an instance of digital content is created by one or more authors. Over time that content may be edited. One or more individuals may provide some editorial oversight thereby approving the content for publication. Publishing may take many forms. Publishing may be the act of pushing content out to others, or simply granting digital access rights to certain content to a particular person or group of persons. Later that content may be superseded by another form of content and thus retired or removed from use.
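As a rough illustration, the life cycle described above can be modeled as a small state machine. This is a generic Python sketch; the class, state names, and allowed transitions are invented for the example and not taken from any particular content management system:

```python
# Minimal sketch of a digital-content life cycle: a draft is created,
# may be updated, gets published, is later archived, and can be
# retrieved back into use. All names here are illustrative.

ALLOWED = {
    "draft": {"update", "publish"},
    "published": {"update", "archive"},
    "archived": {"retrieve"},
}

class ContentItem:
    def __init__(self, body, author):
        self.body = body
        self.authors = [author]
        self.state = "draft"

    def transition(self, action):
        if action not in ALLOWED[self.state]:
            raise ValueError(f"cannot {action} while {self.state}")
        if action == "publish":
            self.state = "published"
        elif action == "archive":
            self.state = "archived"
        elif action == "retrieve":
            self.state = "published"   # restored for use again
        # "update" leaves the state unchanged

doc = ContentItem("First draft", author="alice")
doc.transition("publish")
doc.transition("archive")
doc.transition("retrieve")
```

The point of encoding the allowed transitions explicitly is that superseded or retired content cannot silently re-enter publication without passing through a defined retrieval step.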
Content management is an inherently collaborative process. It often consists of the following basic roles and responsibilities:
Creator - responsible for creating and editing content.
Editor - responsible for tuning the content message and the style of delivery, including translation and localization.
Publisher - responsible for releasing the content for use.
Administrator - responsible for managing access permissions to folders and files, usually accomplished by assigning access rights to user groups or roles. Admins may also assist and support users in various ways.
Consumer, viewer or guest - the person who reads or otherwise takes in content after it is published or shared.
A critical aspect of content management is the ability to manage versions of content as it evolves (see also version control). Authors and editors often need to restore older versions of edited products due to a process failure or an undesirable series of edits.
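A minimal sketch of that restore capability, using an append-only version history (illustrative only; real version control systems store deltas, authorship, and timestamps):

```python
# Sketch of version tracking with the ability to restore an older
# version after an undesirable series of edits. Illustrative only.

class VersionedContent:
    def __init__(self, initial):
        self.versions = [initial]   # append-only history

    @property
    def current(self):
        return self.versions[-1]

    def edit(self, new_text):
        self.versions.append(new_text)

    def restore(self, version_number):
        # Restoring appends the old text as a *new* version, so the
        # undesirable edits remain in the history for auditing.
        self.versions.append(self.versions[version_number])

page = VersionedContent("v1: good copy")
page.edit("v2: accidental mass deletion")
page.restore(0)
```

Appending on restore, rather than truncating, mirrors how most real systems treat a revert: the mistake stays visible in the history.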
Another equally important aspect of content management involves the creation, maintenance, and application of review standards. Each member of the content creation and review process has a unique role and set of responsibilities in the development and/or publication of the content. Each review team member requires clear and concise review standards which must be maintained on an ongoing basis to ensure the long-term consistency and health of the knowledge base.
A content management system is a set of automated processes that may support the following features:
Import and creation of documents and multimedia material
Identification of all key users and their roles
The ability to assign roles and responsibilities to different instances of content categories or types.
Definition of workflow tasks often coupled with messaging so that content managers are alerted to changes in content.
The ability to track and manage multiple versions of a single instance of content.
The ability to publish the content to a repository to support access to the content. Increasingly, the repository is an inherent part of the system, and incorporates enterprise search and retrieval.
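The workflow-messaging feature, where content managers are alerted to changes in content, can be sketched with a simple subscribe/notify pattern (a generic illustration; the class and names are invented, not drawn from any real CMS):

```python
# Sketch of change notifications: content managers subscribe to a
# store and are alerted whenever content changes. Illustrative only.

class ContentStore:
    def __init__(self):
        self.items = {}
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def save(self, key, value):
        self.items[key] = value
        for notify in self.subscribers:
            notify(key)

alerts = []
store = ContentStore()
store.subscribe(lambda key: alerts.append(f"changed: {key}"))
store.save("homepage", "<h1>Welcome</h1>")
```

In a production system the callback would send an email or task-queue message rather than append to a list, but the coupling of "save" to "alert" is the same idea.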
Content management systems take the following forms:
a web content management system is software for web site management - which is often what is implicitly meant by this term
the work of a newspaper editorial staff organization
a workflow for article publication
a document management system
a single source content management system - where content is stored in chunks within a relational database

Implementations

A content management implementation must be able to manage content distribution and digital rights throughout the content life cycle. Content management systems are therefore often coupled with digital rights management (DRM) systems to control user access and digital rights. Here the read-only structures of DRM systems impose limitations on content management implementations, because they do not allow protected content to be changed during its life cycle. Creating new content from managed (protected) content is another issue, since the derived content can fall outside the control of the management system. Few content management implementations address all of these issues.

Content development (web)

Web content development is the process of researching, writing, gathering, organizing, and editing information for publication on web sites. Web site content may consist of prose, graphics, pictures, recordings, movies or other media assets that could be distributed by a hypertext transfer protocol server, and viewed by a web browser.

Content developers and web developers

When the World Wide Web began, web developers either generated content themselves, or took existing documents and coded them into hypertext markup language (HTML). In time, the field of web site development came to encompass many technologies, so it became difficult for web site developers to maintain so many different skills. Content developers are specialized web site developers who have mastered content generation skills. They can integrate content into new or existing web sites, but they may not have skills such as script language programming, database programming, graphic design and copywriting.
Content developers may also be search engine optimization (SEO) specialists or Internet marketing professionals, because content is often called 'king'. High-quality, unique content is what search engines look for, so content development specialists play a very important role in the search engine optimization process. One issue currently plaguing web content development is keyword-stuffed content prepared solely to manipulate search engines, which gives genuine web content writing professionals a bad name; the effect is content written to appeal to machines (algorithms) rather than to people or communities. Search engine optimization specialists commonly submit content to article directories to build their website's authority on a given topic. Most article directories allow visitors to republish submitted content provided that all links are maintained, and this has become a method of search engine optimization for many websites. If written according to SEO copywriting rules, the submitted content benefits the publisher (free SEO-friendly content for a webpage) as well as the author (a hyperlink pointing to his or her website, placed on an SEO-friendly webpage).

Content designer

A content designer is a designer who designs content for media or software. The term is mainly used in web development. Depending on the content format, the content designer usually holds a more specific title such as graphic designer for graphical content, writer for textual content, instructional designer for educational content, or a programmer for automated program/data-driven content.

A senior content designer is a designer who leads a "content design" group in designing new content for a product. Depending on the purpose of the content, the role of a senior content designer may be similar or identical to a communication design, game development or educational role with a different title more associated with those professions. For example: a senior content designer in a communication design profession is better known as a creative director.

Content adaptation

Content Adaptation is the action of transforming content to adapt to device capabilities. Content adaptation is usually related to mobile devices that require special handling because of their limited computational power, small screen size and constrained keyboard functionality.
Content adaptation can roughly be divided into two fields: media content adaptation, which adapts media files, and browsing content adaptation, which adapts Web sites to mobile devices.

Browsing Content Adaptation

Advances in the capabilities of small, mobile devices, such as mobile phones (cell phones) and personal digital assistants, have led to an explosion in the number of types of device that can now access the Web. Some commentators refer to the Web as accessed from mobile devices as the Mobile Web.
The sheer number and variety of Web-enabled devices poses significant challenges for authors of Web sites who want to support access from mobile devices. The W3C Device Independence Working Group described many of the issues in its report Authoring Challenges for Device Independence.
One approach to solving the problem is based around the concept of Content Adaptation. Rather than requiring authors to create pages explicitly for each type of device that might request them, content adaptation transforms an author's materials automatically.
For example, content might be converted from a device-independent markup language, such as XDIME, an implementation of the W3C's DIAL specification, into a form suitable for the device, such as XHTML Basic, C-HTML or WML. Similarly a suitable device-specific CSS style sheet or a set of in-line styles might be generated from abstract style definitions. Likewise a device specific layout might be generated from abstract layout definitions.
Once created, the device-specific materials form the response returned to the device from which the request was made.
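That selection-and-generation step can be sketched as a lookup from device capabilities to an output markup. This is a toy example: the capability table below is invented, whereas real systems consult a device description repository such as WURFL:

```python
# Toy sketch of content adaptation: pick a device-appropriate markup
# target from a (hypothetical) capability table, then generate the
# device-specific result. Device names and data are invented.

CAPABILITIES = {
    "old-wap-phone": {"markup": "WML", "max_image_px": 96},
    "feature-phone": {"markup": "XHTML Basic", "max_image_px": 240},
    "desktop": {"markup": "HTML", "max_image_px": 4096},
}

def adapt(device, title):
    # Unknown devices fall back to the desktop profile.
    caps = CAPABILITIES.get(device, CAPABILITIES["desktop"])
    if caps["markup"] == "WML":
        return f'<wml><card title="{title}"/></wml>'
    return f"<html><head><title>{title}</title></head><body/></html>"

rendered = adapt("old-wap-phone", "News")
```

The same author content ("News") yields different device-specific responses, which is the essence of the approach: one source, many renderings.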
Content adaptation requires a processor that performs the selection, modification and generation of materials to form the device-specific result. IBM's WebSphere Everyplace Mobile Portal (WEMP), BEA Systems' WebLogic Mobility Server, Morfeo's MyMobileWeb and Apache Cocoon are examples of such processors.
WURFL and WALL are popular open source tools for content adaptation. WURFL is an XML-based Device Description Repository with APIs to access the data in Java and PHP (and other popular programming languages). WALL (Wireless Abstraction Library) lets a developer author mobile pages that look like plain HTML, but converts them to WML, C-HTML and XHTML Mobile Profile depending on the capabilities of the device from which the HTTP request originates.
Alembik (Media Transcoding Server) is a Java (J2EE) application providing transcoding services for a variety of clients and for different media types (image, audio, video, etc.). It is fully compliant with OMA's Standard Transcoder Interface specification and is distributed under the LGPL open source license.
Launched in 2007, Bytemobile’s Web Fidelity Service was the first carrier-grade, commercial infrastructure solution to provide wireless content adaptation to mobile subscribers on their existing mass-market handsets, with no client download required.

Cloud networking

Cloud networking is the interconnection of components to "meet the networking requirements inherent in cloud computing". Cloud networking allows users to "tap a vast network of computers that can be accessed from long distance by a cell phone, laptop or mobile device for information or data".

Legal issues

U.S. Trademark 77,596,599: Arastra, Inc. (aka Arista) applied to the USPTO on 20 October 2008 to trademark the descriptive and generic term on a section 1(b) (intent to use) basis, covering "networking hardware and software to interconnect computers, servers and storage devices; computer software for use in controlling the operation and management of networks; computer software for use in connecting computer networks and systems, servers and storage devices; instructional manuals sold as a unit therewith", despite extensive prior use of the term by other companies such as Asankya ("the leader in Cloud networking services"), the existence of various solutions already in the space, and generic use by the press and bloggers.
This trademark application was listed as abandoned on 3 February 2009, perhaps because, as with Dell's earlier attempt to trademark "Cloud Computing", the term was considered too generic.

Cloud Computing Manifesto

The Cloud Computing Manifesto is a manifesto containing a "public declaration of principles and intentions" for cloud computing providers and vendors, annotated as "a call to action for the worldwide cloud community" and "dedicated belief that the cloud should be open". It follows the earlier development of the Cloud Computing Bill of Rights which addresses similar issues from the users' point of view.
The document was developed "by way of an open community consensus process" in response to a request by Microsoft that "any 'manifesto' should be created, from its inception, through an open mechanism like a Wiki, for public debate and comment, all available through a Creative Commons license". Accordingly, it is hosted on a MediaWiki wiki and licensed under the CC-BY-SA 3.0 license.
The original, controversial version of the document, called the Open Cloud Manifesto, was sharply criticised by Microsoft, who "spoke out vehemently against it" for being developed in secret by a "shadowy group of IT industry companies", raising questions about conflicts of interest and resulting in extensive media coverage over the following days. A pre-announcement committed to the official publication of the document on March 30, 2009 (in spite of calls to publish it earlier), at which time the identities of the signatories ("several of the largest technology companies and organizations", led by IBM along with OMG, and believed also to include Cisco, HP, and Sun Microsystems) were said to be revealed. Amazon, Google, Microsoft and Salesforce.com are among those known to have rejected the document by declining to be signatories. The document was leaked by Geva Perry in a blog post on 27 March 2009 and confirmed to be authentic shortly afterwards.
The authors of both public and private documents have agreed to "work to bring together the best points of each effort".

Controversy

The Open Cloud Manifesto version, developed in private by a secret consortium of companies, was prematurely revealed by Microsoft's Senior Director of Developer Platform Product Management, Steve Martin, on 26 March 2009. He claimed that Microsoft was "privately shown a copy of the document, warned that it was a secret, and told that it must be signed 'as is,' without modifications or additional input", a point disputed by Reuven Cohen (originally believed to be the document's author). Some commentators found it ironic that Microsoft should speak out in support of open standards, while others felt that its criticism was justified, comparing the episode to the "long, ugly war over WS-I". The call for open cloud standards was later echoed by Brandon Watson, Microsoft's Director of Cloud Services Ecosystem.

Principles

The following principles are defined by the document:
1. User centric systems enrich the lives of individuals, education, communication, collaboration, business, entertainment and society as a whole; the end user is the primary stakeholder in cloud computing.
2. Philanthropic initiatives can greatly increase the well-being of mankind; they should be enabled or enhanced by cloud computing where possible.
3. Openness of standards, systems and software empowers and protects users; existing standards should be adopted where possible for the benefit of all stakeholders.
4. Transparency fosters trust and accountability; decisions should be open to public collaboration and scrutiny and never be made "behind closed doors".
5. Interoperability ensures effectiveness of cloud computing as a public resource; systems must be interoperable over a minimal set of community defined standards and vendor lock-in must be avoided.
6. Representation of all stakeholders is essential; interoperability and standards efforts should not be dominated by vendor(s).
7. Discrimination against any party for any reason is unacceptable; barriers to entry must be minimised.
8. Evolution is an ongoing process in an immature market; standards may take some time to develop and coalesce but activities should be coordinated and collaborative.
9. Balance of commercial and consumer interests is paramount; if in doubt consumer interests prevail.
10. Security is fundamental, not optional.

Cloud computing

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the "cloud" that supports them.
The concept generally incorporates combinations of the following:
infrastructure as a service (IaaS)
platform as a service (PaaS)
software as a service (SaaS)
other recent (ca. 2007–09) technologies that rely on the Internet to satisfy the computing needs of users.
Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers.
The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams and is an abstraction for the complex infrastructure it conceals.
The first academic use of this term appears to be by Prof. Ramnath K. Chellappa (currently at Goizueta Business School, Emory University) who originally defined it as a computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits.

Characteristics

Cloud computing customers do not generally own the physical infrastructure serving as host to the software platform in question. Instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed, while others bill on a subscription basis. Sharing "perishable and intangible" computing power among multiple tenants can improve utilization rates, as servers are not unnecessarily left idle (which can reduce costs significantly while increasing the speed of application development). A side effect of this approach is that overall computer usage rises dramatically, as customers do not have to engineer for peak load limits. Additionally, "increased high-speed bandwidth" makes it possible to receive the same response times from centralized infrastructure at other sites.

Economics

Cloud computing users can avoid capital expenditure (CapEx) on hardware, software, and services when they pay a provider only for what they use. Consumption is usually billed on a utility (e.g. resources consumed, like electricity) or subscription (e.g. time based, like a newspaper) basis with little or no upfront cost. A few cloud providers are now beginning to offer the service for a flat monthly fee as opposed to on a utility billing basis. Other benefits of this time sharing style approach are low barriers to entry, shared infrastructure and costs, low management overhead, and immediate access to a broad range of applications. Users can generally terminate the contract at any time (thereby avoiding return on investment risk and uncertainty) and the services are often covered by service level agreements (SLAs) with financial penalties.
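The utility-versus-subscription distinction above can be made concrete with a small calculation. All prices here are invented purely for illustration, and amounts are kept in integer cents to avoid rounding artifacts:

```python
# Compare utility billing (pay per unit consumed) with subscription
# billing (flat monthly fee) for a month of compute time.
# All prices below are invented example figures.

def utility_cost_cents(hours_used, cents_per_hour):
    return hours_used * cents_per_hour

burst_cents = utility_cost_cents(120, 10)    # bursty month: $12.00
flat_cents = 5000                            # flat subscription: $50.00
steady_cents = utility_cost_cents(720, 10)   # 24x7 month: $72.00

# A bursty workload is far cheaper on utility billing, while a
# steady round-the-clock workload favors the flat subscription.
```

This is the trade-off the text describes: usage-based pricing rewards intermittent consumption, while flat fees reward constant consumption.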
According to Nicholas Carr, the strategic importance of information technology is diminishing as it becomes standardized and less expensive. He argues that the cloud computing paradigm shift is similar to the displacement of electricity generators by electricity grids early in the 20th century.
Although companies might be able to save on upfront capital expenditures, they might not save much and might actually pay more for operating expenses. In situations where the capital expense would be relatively small, or where the organization has more flexibility in their capital budget than their operating budget, the cloud model might not make great fiscal sense. Other factors impacting the scale of any potential cost savings include the efficiency of a company’s data center as compared to the cloud vendor’s, the company’s existing operating costs, the level of adoption of cloud computing, and the type of functionality being hosted in the cloud.

Companies

VMware, Sun Microsystems, Rackspace US, IBM, Amazon, Google, BMC, Microsoft, and Yahoo are some of the major cloud computing service providers. Cloud services are also being adopted by users ranging from individuals to large enterprises, including VMware, General Electric, and Procter & Gamble.
As of 2009, new players, such as Ubuntu Cloud Computing, are gaining attention in the industry.

Architecture

The majority of cloud computing infrastructure, as of 2009, consists of reliable services delivered through data centers and built on servers with different levels of virtualization technologies. The services are accessible anywhere that provides access to networking infrastructure. Clouds often appear as single points of access for all consumers' computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers and typically offer SLAs. Open standards are critical to the growth of cloud computing, and open source software has provided the foundation for many cloud computing implementations.

Criticism and Disadvantages of Cloud Computing

Because cloud computing does not let users physically possess their own data storage (except when data is backed up to a user-owned device, such as a USB flash drive or hard disk), it leaves responsibility for data storage and control in the hands of the provider.
Cloud computing has been criticized for limiting users' freedom and making them dependent on the cloud computing provider, and some critics allege that it is only possible to use applications or services that the provider is willing to offer. The London Times thus compares cloud computing to the centralized systems of the 1950s and 60s, in which users connected through "dumb" terminals to mainframe computers: users typically had no freedom to install new applications and needed approval from administrators to accomplish certain tasks, which limited both freedom and creativity. The Times argues that cloud computing is a regression to that time.
Similarly, Richard Stallman, founder of the Free Software Foundation, believes that cloud computing endangers liberties because users sacrifice their privacy and personal data to a third party. He stated that cloud computing is "simply a trap aimed at forcing more people to buy into locked, proprietary systems that would cost them more and more over time."
Further to Stallman's observation, hosting, deploying, and maintaining intranet and access-restricted sites (for government, defense, institutional use, etc.) would be a challenge. Commercial sites using tools such as web analytics may not be able to capture the right data for their business planning.

Risk mitigation

Corporations or end-users who want to avoid losing access to their data, or losing the data itself, are typically advised to research vendors' policies on data security before using their services. The technology analyst and consulting firm Gartner lists several security issues that one should discuss with cloud-computing vendors:
Privileged user access—Who has privileged access to data, and how are such administrators hired and managed?
Regulatory compliance—Is the vendor willing to undergo external audits and/or security certifications?
Data location—Does the provider allow for any control over the location of data?
Data segregation—Is encryption available at all stages, and were these encryption schemes designed and tested by experienced professionals?
Recovery—What happens to data in the case of a disaster, and does the vendor offer complete restoration, and, if so, how long does that process take?
Investigative Support—Does the vendor have the ability to investigate any inappropriate or illegal activity?
Long-term viability—What happens to data if the company goes out of business, and is data returned and in what format?
Data availability—Can the vendor move your data onto a different environment should the existing environment become compromised or unavailable?
In practice, one can best determine data-recovery capabilities by experiment; for example, by asking to get back old data, seeing how long it takes, and verifying that the checksums match the original data. Determining data security can be more difficult, but one approach is to encrypt the data yourself. If you encrypt data using a trusted algorithm, then, regardless of the service provider's security and encryption policies, the data will only be accessible with the decryption keys. This leads, however, to the problem of managing private keys in a pay-on-demand computing infrastructure.
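The checksum verification suggested above can be sketched with Python's standard hashlib. The provider side is simulated here with an in-memory dict, so nothing in this sketch reflects any real vendor's API:

```python
import hashlib

# Verify that data retrieved from a provider matches what was stored,
# by comparing SHA-256 checksums computed before upload and after
# retrieval. The "provider" below is simulated with a dict.

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"quarterly-report contents"
stored_sum = checksum(original)        # record this before uploading

provider = {"backup/report.csv": original}   # simulated remote store

retrieved = provider["backup/report.csv"]
intact = checksum(retrieved) == stored_sum
```

Keeping your own checksums means verification does not depend on trusting the provider's integrity guarantees; the same idea extends to encrypting the data yourself, where only the holder of the keys can read it.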

Key characteristics

Agility improves with users able to rapidly and inexpensively re-provision technological infrastructure resources. The cost of overall computing is unchanged, however, and the providers will merely absorb up-front costs and spread costs over a longer period.
Cost is claimed to be greatly reduced, and capital expenditure is converted to operational expenditure. This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer in-house IT skills are required for implementation. Some would argue that, given the low cost of computing resources, the IT burden and its cost merely shift from in-house staff to outsourced providers. Furthermore, any cost reduction must be weighed against a corresponding loss of control, access and security risks.
Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile). As the infrastructure is off-site (typically provided by a third party) and accessed via the Internet, users can connect from anywhere.
Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for highest possible load-levels)
Utilization and efficiency improvements for systems that are often only 10–20% utilized.
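The utilization point above can be illustrated with some simple arithmetic (the workload size and utilization percentages are invented for the example):

```python
# If dedicated servers average 15% utilization, consolidating the same
# work onto shared multi-tenant servers running at, say, 60% needs a
# quarter of the machines. All figures below are illustrative.

def servers_needed(total_work_units, capacity_per_server, utilization_pct):
    effective = capacity_per_server * utilization_pct // 100
    return -(-total_work_units // effective)  # ceiling division

dedicated = servers_needed(900, 100, 15)   # at 15% utilization
shared = servers_needed(900, 100, 60)      # at 60% utilization
```

Quadrupling effective utilization cuts the fleet by the same factor, which is where the cost and energy savings claimed for multi-tenancy come from.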
Reliability improves through the use of multiple redundant sites, which makes cloud computing suitable for business continuity and disaster recovery. Nonetheless, many major cloud computing services have suffered outages, and IT and business managers can at times do little when they are affected.
Scalability via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads. Performance is monitored, and consistent and loosely-coupled architectures are constructed using web services as the system interface.
Security typically improves due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. Providers typically log accesses, but accessing the audit logs themselves can be difficult or impossible. Ownership, control and access to data controlled by "cloud" providers may be made more difficult, just as it is sometimes difficult to gain access to "live" support with current utilities. Under the cloud paradigm, management of sensitive data is placed in the hands of cloud providers and third parties.
Sustainability comes about through improved resource utilization, more efficient systems, and carbon neutrality. Nonetheless, computers and associated infrastructure are major consumers of energy, and a given (server-based) computing task will use roughly the same amount of energy whether it runs on-site or off.

Application

A cloud application leverages the Cloud in software architecture, often eliminating the need to install and run the application on the customer's own computer, thus alleviating the burden of software maintenance, ongoing operation, and support. For example:
Peer-to-peer / volunteer computing (Bittorrent, BOINC Projects, Skype)
Web application (Facebook)
Software as a service (Google Apps, SAP and Salesforce)
Software plus services (Microsoft Online Services)

Infrastructure

Cloud infrastructure, such as Infrastructure as a service, is the delivery of computer infrastructure, typically a platform virtualization environment, as a service. For example:
Full virtualization (GoGrid, Skytap, iland)
Grid computing (Sun Cloud)
Management (RightScale)
Compute (Amazon Elastic Compute Cloud)
Platform (Force.com)
Storage (Amazon S3, Nirvanix, Rackspace)

Platform

A cloud platform, such as Platform as a service (the delivery of a computing platform and/or solution stack as a service), facilitates deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers. For example:
Code Based Web Application Frameworks
Java Google Web Toolkit (Google App Engine)
Python Django (Google App Engine)
Ruby on Rails (Heroku)
.NET (Azure Services Platform)
Non-Code Based Web Application Framework
WorkXpress
Cloud Computing Application & Web Hosting (Rackspace Cloud)
Proprietary (Force.com)

User

A user is a consumer of cloud computing. The privacy of users in cloud computing has become an increasing concern. The rights of users are also an issue, which is being addressed via a community effort to create a bill of rights. The Franklin Street Statement was drafted with an eye toward protecting users' freedoms.
