Monday, February 25, 2013
2012 Glorious Hope - Soli Deo Gloria: Mysterious shaking puzzles experts in Morristown, ...: February 15, 2013 – MORRISTOWN, TN - Late Tuesday afternoon dozens of concerned neighbors called Hamblen County 911 to report the earth...
Monday, February 18, 2013
Extreme Weather Fluctuations as the Climate Reacts to Geoengineering
by Dane Wigington
What’s Wrong With The Weather?
Global geoengineering/weather modification programs are completely disrupting the planet's natural weather patterns from top to bottom. The entire climate system is so totally out of balance at this point that it is swinging radically from one extreme to another. These massive fluctuations are being “forced” by the global climate/weather modification programs known as “solar radiation management” (SRM) and “stratospheric aerosol geoengineering” (SAG).
Adequate Precipitation, Or Colder Temperatures? Now, More Often Than Not, You Can’t Have Both, Thanks To Geoengineering
If one takes the time to examine the first NOAA map below (the temperature forecast), even without any meteorological experience it is easy enough to recognize that there is extreme contrast. Temperature gradients should trend more from north to south, not from west to east.
The NOAA projected temperature map below is for the period from 2/10/13 to 2/14/13. The orange-to-red colors with the letter “A” indicate “above” normal temperatures. The darker the color, the further above normal the temperatures are predicted to be. In this case, the darkest shaded areas would indicate a prediction in the 15 to 20 degree above “normal” range.
Toward the west/southwest US, a range of blue coloration (with the letter B for “below” normal) reflects far below average temperatures. In the darkest shaded areas these maps predict something in the 15 to 20 degree below average range.
The second map below reflects rainfall “predictions” (more accurately considered “scheduled” weather, as virtually all the “weather modeling” maps for NOAA are now done by defense industry contractors like Raytheon, the same contractors conducting the geoengineering programs). The second map is for the same period as the first map. Again, areas with the “A” indicate a “prediction” of above normal rainfall. The darker the color, the further above normal. “B” is below normal rainfall. The darker the shaded area, the drier it is “predicted” (scheduled) to be.
So How Is Geoengineering Affecting The “Forecast” Maps And The Weather?
Here is the important consideration between these maps: in general, the further above average the temperatures are, the more precipitation there will be; the lower the temps, the less precipitation there will be. At first glance this could seem straightforward enough. After all, the atmosphere does hold about 7% more moisture for every degree (Celsius) of temperature rise, but that is not the full story any longer. The atmosphere is being completely saturated with toxic, reflective, desiccating geoengineering particulates, and the jet stream increasingly appears to be consistently manipulated with ionosphere heater installations. (See HAARP manipulates jet stream)
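As a quick sanity check on that 7%-per-degree figure (my own illustration, not part of the original article), the number falls out of the Clausius-Clapeyron relation; the short sketch below evaluates the standard Magnus approximation for saturation vapor pressure:

```python
import math

# A minimal sanity check of the "7% more moisture per degree" figure using
# the Magnus approximation to the Clausius-Clapeyron relation.
# Constants are the standard Magnus coefficients; illustrative only.

def saturation_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure over liquid water, in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

for t in (0.0, 10.0, 20.0, 30.0):
    gain = saturation_vapor_pressure(t + 1.0) / saturation_vapor_pressure(t) - 1.0
    print(f"{t:4.0f} C -> {t + 1:4.0f} C: +{gain:.1%} moisture-holding capacity")
# Prints roughly +6% to +7% per degree Celsius near typical surface
# temperatures, consistent with the figure cited above.
```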
The more the geoengineers spray to try and cool down the temperatures, the less it will rain, period. (The science is clear on this. Google “geoengineering reduces rainfall”; there are simply too many studies on this to link only one.) Add “artificial ice nucleation” to the spray mix and the precipitation appears to go down still further. “Artificial ice nucleation” is a chemical process that can produce colder weather/cloud temperatures and snow out of what should have been a rain storm at well above freezing temperatures. If the temps are already cold enough for snow, this same process can lower the temperatures even further, but at a cost: not much snow compared to historical norms.
There are, of course, exceptions to this when a very moisture-laden storm system is ice nucleated, but the “snow” from such system “conversions” is “heavy wet snow”. This recent term coined by the Weather Channel describes the concrete-like “snow” that sticks like glue to trees and causes utter decimation to the forests. Broken and tipped-over trees are everywhere in the Pacific Northwest from just such an “ice nucleated” storm in late December. The massive amount of heavy metals in these snow storms, tested at a state-certified lab, proves our storms are being “seeded”. This “seeding” of artificial ice nucleating agents is accomplished by spray dispersal into the clouds of a weather system by jet aircraft.
Though the geoengineering programs can and do cool very expansive regions, there is a paradox: it comes at the cost of a worsened warming of the climate overall. The more they spray, the more they have to spray to cover up the damage already done. In addition, as already covered, the “engineered weather” comes at extreme cost to the environment as a whole. It is also important to consider that there are likely many as yet unknown aspects of the global spraying agenda.
The More They Spray, The Less It Will Rain Overall
So, as previously stated: in general, when excessive geoengineering chemical spraying is done to cool the temps down, the precipitation goes down accordingly. The spraying can and does blot out the sun by creating very large-scale upper-level haze/cloud cover. When clouds are supersaturated with toxic heavy metal and/or chemical particles of a very small size (10-nanometer particles are specified as a preferred size by geoengineers and some geoengineering patents), then there are too many “condensation nuclei” for cloud droplets to combine and fall as rain. Storm clouds can be blown apart into an expansive, mostly rainless, and often featureless upper-level canopy of haze, sometimes spanning immense distances (hundreds or even thousands of miles).
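For what it is worth, the underlying microphysics claim here, that more condensation nuclei competing for a fixed amount of cloud water means smaller droplets that are too small to coalesce into rain, can be illustrated with a back-of-the-envelope sketch (my own illustration with assumed values, not anything from the article):

```python
import math

# Sketch of the droplet-size argument: for a fixed liquid water content,
# splitting the water among more condensation nuclei yields smaller droplets.
# All numbers below are assumed illustrative values, not measurements.

RHO_WATER = 1000.0  # density of liquid water, kg/m^3

def mean_droplet_radius(liquid_water_kg_m3: float, nuclei_per_m3: float) -> float:
    """Mean droplet radius (m) if cloud water is split evenly across nuclei."""
    volume_per_droplet = liquid_water_kg_m3 / (RHO_WATER * nuclei_per_m3)
    return (3.0 * volume_per_droplet / (4.0 * math.pi)) ** (1.0 / 3.0)

LIQUID_WATER = 0.5e-3  # 0.5 g/m^3, a typical stratiform-cloud value
for n_per_cm3 in (50, 500, 5000):  # clean marine air -> heavily polluted air
    r = mean_droplet_radius(LIQUID_WATER, n_per_cm3 * 1e6)
    print(f"{n_per_cm3:5d} nuclei/cm^3 -> mean radius {r * 1e6:5.1f} micrometres")
# Tenfold more nuclei shrinks droplets by 10^(1/3) (about 2.15x); droplets
# much smaller than ~14 micrometres coalesce into raindrops very inefficiently.
```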
Again, the blocking of sunlight and the effect of ice nucleating agents/particulates cools the air mass below the clouds but at the cost of reducing or even eliminating precipitation.
Conversely, if the spraying is reduced enough, the total available condensation nuclei are reduced. This allows the cloud droplets to combine and fall as rain, though the temperatures then generally remain well above normal for the time of year and the region.
At this point, the atmosphere has been so devastated by the decades-long geoengineering programs, and so saturated with the toxic metal and chemical fallout from the constant spraying, that there is virtually no completely “natural” weather.
In the case of the maps above, the “scheduled” weather would appear to be heavy spraying of incoming storm fronts as they pass across the west/southwest. The temps are thus “predicted” to drop, with far below average precipitation. Once over the eastern half of the US, spraying will either be reduced or larger particulates could be used, and the moisture that migrated across the west will come down in the east, perhaps in a deluge. Again, there are always unknown variables in the precarious realm of total weather manipulation.
What Is the Environmental Cost Of The Geoengineering?
This question can never be adequately answered, as the decimation to the planet and the entire web of life from 60 years of ever-increasing weather modification with toxic spraying can never be quantified.
We now have massive global ozone destruction in the northern and southern hemispheres, mass extinction of plant and animal species (now estimated to be as high as 10,000 times background extinction rates), total disruption of natural weather patterns, and a complete toxification of our air, water, and soils. How long can life on Earth sustain this total assault?
Though humanity has damaged the biosphere on many fronts, all available data indicates that no single cause of environmental destruction even comes close to the total decimation being inflicted by geoengineering/chemtrails.
Geoengineering must be brought into the light of day, and it’s up to each and every one of us to get this done. Educate yourself on this issue. Arm yourself with credible data. A well-thought-out flyer and a copy of Michael Murphy’s Why in the World Are They Spraying? can do wonders to wake up those who have so far kept their “head in the sand”. Once a critical mass of awareness is reached, and those who actually carry out these programs realize that they are pulling the noose around their own necks along with the rest of us, we will have a chance at stopping these lethal spraying programs.
Original post @ http://www.activistpost.com/2013/02/extreme-weather-fluctuations-as-climate.html
Thursday, February 14, 2013
2012 Glorious Hope - Soli Deo Gloria: LA Sinkhole: Apocalyptic Sounds Heard & Caught On ...: I heard a sound from heaven like the sound of cascading waters and like the rumbling of loud thunder... Revelation 14:2 What are the loud...
Thursday, February 7, 2013
Shocking Numbers That Show The Media Is Lying To You About Unemployment In America
by Michael Snyder
Did you know that the percentage of the working-age U.S. population that is employed has been falling continually since 2006, according to the Bureau of Labor Statistics? Did you know that the increase in the number of Americans "not in the labor force" during Barack Obama's first four years in the White House was more than three times greater than the increase in the number of Americans "not in the labor force" during the entire decade of the 1980s?
The mainstream media would have us believe that 157,000 jobs were added to the U.S. economy in January. Based on that news, the Dow broke the 14,000 barrier for the first time since October 2007. But if you actually look at the "non-seasonally adjusted" numbers, the number of Americans with a job actually decreased by 1,446,000 between December and January. But nowhere in the mainstream media did you hear that the U.S. economy lost more than 1.4 million jobs between December and January.
It is amazing the things that you can find out when you actually take the time to look at the hard numbers instead of just listening to the media spin. Back in 2007, more than 146 million Americans were employed. Today, only 141.6 million Americans are employed even though our population has grown steadily since then. When the government and the media tell you that we are in a "recovery" and that unemployment is lower than it was a couple of years ago, I encourage you to dig deeper.
The truth is that even the government's own numbers tell us that the percentage of the U.S. labor force that is employed continues to fall and that the U.S. economy is heading into a recession. The Obama administration and the media have been lying to you about unemployment and about the true condition of our economy. After you see the numbers that I have compiled in this article, I think that you will agree with me.
First of all, let's take a look at the percentage of the civilian working-age population that has been employed (the employment-population ratio) over the past several years. These numbers come directly from the Bureau of Labor Statistics. As you can see, this is a number that has been steadily falling since 2006...
2006: 63.1
2007: 63.0
2008: 62.2
2009: 59.3
2010: 58.5
2011: 58.4
In January, only 57.9 percent of the civilian working-age population was employed.
Do the numbers above represent a positive trend or a negative trend?
Even a 2nd grader could answer that question.
So how in the world can the Obama administration and the mainstream media claim that the employment picture is getting better and that we are in a "recovery"?
But most Americans believe what they are told. It is almost as if we are in some kind of a "matrix" where reality is defined by the corporate-controlled propaganda that is relentlessly pumped into our brains.
The only way that the government has been able to show a declining unemployment rate is by dumping massive numbers of Americans into the "not in the labor force" category.
Just check out how the number of Americans "not in the labor force" has absolutely skyrocketed in recent years...
2006: 77,387,000
2007: 78,743,000
2008: 79,501,000
2009: 81,659,000
2010: 83,941,000
2011: 86,001,000
In January, there were supposedly 89,868,000 Americans at least 16 years of age who were not in the labor force.
That number has risen by more than 8 million since Barack Obama first entered the White House, and that is highly unusual, because the number of Americans "not in the labor force" only increased by 2,518,000 during the entire decade of the 1980s.
You sure can get the numbers to look more "favorable" if you pretend that millions upon millions of American workers simply "don't want a job" any longer. The truth is that if the labor force participation rate were at the same level as when Barack Obama was first elected, the official unemployment rate would be well above 10 percent.
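That "well above 10 percent" claim can be checked with simple arithmetic. The sketch below (my own illustration) uses only the January figures quoted in this article, plus one assumption: a labor force participation rate of roughly 65.8 percent in late 2008.

```python
# Recomputing the unemployment rate under a fixed participation rate, using
# only the January figures quoted in this article plus one assumption: a
# labor force participation rate of roughly 65.8% in late 2008.

employed = 141.6e6             # employed Americans (from the article)
not_in_labor_force = 89.868e6  # "not in the labor force" (from the article)
emp_pop_ratio = 0.579          # employment-population ratio (from the article)

population_16_plus = employed / emp_pop_ratio          # ~244.6 million
labor_force = population_16_plus - not_in_labor_force  # ~154.7 million
official_rate = 1.0 - employed / labor_force
print(f"implied unemployment rate:        {official_rate:.1%}")  # ~8.5%

# Counterfactual: hold participation at its assumed late-2008 level.
counterfactual_labor_force = 0.658 * population_16_plus
counterfactual_rate = 1.0 - employed / counterfactual_labor_force
print(f"rate at late-2008 participation:  {counterfactual_rate:.1%}")  # ~12%

# The small gap between ~8.5% and the 7.9% headline rate reflects the mix of
# seasonally and non-seasonally adjusted figures quoted in the article.
```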
But that wouldn't do at all, would it? 7.9 percent sounds so much nicer.
And of course even if you do have a job that does not mean that you are doing okay.
If you can believe it, in America today 41 percent of all workers make $20,000 a year or less.
To me, that is a mind blowing statistic. It would be incredibly challenging for anyone to live on $20,000 a year, much less try to support a family.
If you live in Washington D.C. or New York City and you have a "good job" working for the establishment, you may not realize it, but there are tens of millions of American families that are really hurting out there. According to the U.S. Census Bureau, more than 146 million Americans are either "poor" or "low income" at this point, and most of those people actually do have jobs.
For much more on the "working poor" in the United States, please see my previous article entitled "35 Statistics About The Working Poor In America That Will Blow Your Mind".
If something is not done, the middle class will continue to disappear and poverty in America will continue to explode.
In a previous article, I noted that during Obama's first term, the number of Americans on food stamps increased by an average of about 11,000 per day.
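As a rough check of that per-day figure (my own illustration; the SNAP enrollment endpoints are assumed approximations from public USDA data, not numbers taken from this article), the arithmetic works out:

```python
# Rough check of the "about 11,000 per day" food-stamp claim. The SNAP
# enrollment endpoints below are assumed approximations from public USDA
# figures, not numbers taken from this article.
start_enrollment = 32.0e6  # ~Jan 2009 (assumed)
end_enrollment = 47.8e6    # ~late 2012 (assumed)
days = 4 * 365.25
print(f"average increase: {(end_enrollment - start_enrollment) / days:,.0f} per day")
# -> roughly 10,800 per day, in line with the ~11,000/day figure.
```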
How bad do things have to get before people realize that we are living through a nightmare?
Most Americans are still convinced that our politicians will somehow find a way to turn things around.
Most Americans will gather around their television sets this weekend and watch the Super Bowl and laugh at all the funny commercials without even thinking about how America is literally falling apart all around them.
But there is one group of Americans that is acutely aware of how bad things have really gotten. Small businesses have traditionally been the primary engine of job growth in this country, but right now small business owners all over the nation are facing a tremendous crisis.
Millions of small businesses are on the verge of extinction, and yet our politicians just continue to pile on more taxes, more rules and more regulations.
A recent Gallup poll found that 61 percent of all small business owners in America are "worried about the potential cost of healthcare", and that an astounding 30 percent of all small business owners in America are not hiring and fear that they will go out of business within the next 12 months.
In a previous article entitled "We Are Witnessing The Death Of Small Business In America", I detailed how small businesses in America are being systematically wiped out. Small businesses are dying all around us, and the number of new small businesses continues to decline.
According to economist Tim Kane, the following is how the decline in the number of start-up jobs per one thousand Americans breaks down by presidential administration...
Bush Sr.: 11.3
Clinton: 11.2
Bush Jr.: 10.8
Obama: 7.8
Is that a good trend or a bad trend?
All of this is so simple that even the family pet should be able to figure it out, and yet most Americans seem oblivious to all of this. They just keep gobbling up the mainstream media propaganda and they just continue to go out and wildly spend money.
It is almost as if we didn't learn any lessons from 2008.
Even while household spending in Europe has moderated, household spending in the United States continues to soar. Just check out the chart in this article.
And guess what? The infamous "no money down mortgages" are back. If we wait long enough, perhaps "interest only mortgages" will make a comeback as well.
Unfortunately, I am afraid that time is running out. We have been living in the biggest debt bubble in the history of the world, and it is only a matter of time until it bursts.
2008 was just a "hiccup" compared to what is coming. Our politicians and the Federal Reserve were able to keep the house of cards from completely crashing down back then, but they are not going to be able to avert the economic horror show that is rapidly approaching.
I hope that you are getting prepared. Back in 2008, millions of Americans suddenly lost their jobs, and because many of them did not have any savings, many of them suddenly lost their homes. One of the most important things that you can do to prepare for the coming crisis is to build up an emergency fund. If things suddenly go bad, you don't want to lose your house and everything that you have always worked for.
In addition, anything that you can do to become more self-sufficient and more independent of the system is a good thing, because the system is failing. The years ahead are going to be much more chaotic than what we are experiencing right now, and when the next crisis strikes you will be very thankful for the time and the energy that you put into preparing.
So what are all of you seeing in your own areas?
Are businesses shutting down?
Are people having a hard time finding good jobs?
Original Post @ http://www.activistpost.com/2013/02/shocking-numbers-that-show-media-is.html
Monday, February 4, 2013
Codex Alimentarius and GM Food Guidelines, Pt. 1
by Brandon Turbeville
Over the last two years, I have written extensively about the Codex Alimentarius guidelines and how they relate specifically to vitamin and mineral supplements, food irradiation, and the use of Recombinant Bovine Growth Hormone (rBGH).
I have also detailed the history and workings of the international organization as well as many of the current day to day manifestations of Codex guidelines as they appear in domestic policy.
However, there is yet another area in which Codex guidelines will play a major role in the development of food policy – namely, the proliferation of Genetically Modified Food.
The Codex committee that serves as the main battleground for the consideration of GM food is the Codex Committee on Food Labeling (CCFL). This committee is extremely relevant because it can effectively reduce the power of the consumer to virtually nothing if it decides not to force companies or countries to label their GM food, thus removing the ability of the consumer to boycott and/or avoid those products. While it is well-known that public sentiment is unimportant to those at the top, governments and corporations tend to pay more attention when votes and sales reflect that sentiment. However, if Codex continues on its way to allowing unlabeled GM food onto the international market, the repercussions of consumer reaction will be entirely neutralized.
A brief discussion of the history of Codex in terms of GM food is necessary here to understand the direction in which the organization is moving on the issue.
For most of the seventeen years that Codex member countries have debated the safety of genetic modification of the food supply, the result has been little or no progress for one side or the other.
In 1993, at the behest of the Codex Commission, the CCFL agreed to begin working on the labeling aspect of GM food. Interestingly enough, the CCFL asked the United States, the country that was the most militant in its support of genetic modification, to develop a paper that would guide the committee’s discussion at the following session. When this session arrived, there was a flurry of opinions tossed around from several different countries. The most sensible position was that all GM foods should be labeled under any circumstances. Yet other countries, especially the pro-GM ones, argued that labeling should only be required when there is the introduction of health or safety concerns or allergens, or when the food is significantly different from its traditional counterpart.[1] This debate largely continues to this day.
The concept of “substantial equivalence” versus “process-based” labeling has also become one of the most hotly contested issues within the Codex GM food labeling debate. Process-based labeling simply means that the driving factor behind the labeling guidelines is the process by which the food is created, grown, or otherwise produced. Therefore, the qualifying factor for labeling GM food would be the process of genetic modification itself, forcing all GM food to be labeled as such. This is essentially the mandatory labeling of all GM food. When this concept was first introduced in 2001, it was supported by such countries as the European Union, India, and Norway. Its staunchest opponents, of course, were the United States and Canada.[2] Although this method of labeling standards was by far the most sensible if one were concerned about food safety and consumer rights of choice, it has been all but abandoned since the brief discussion at its introduction. The attention then has necessarily turned to the competing set of standards known as “substantial equivalence.”
“Substantial equivalence” guidelines are by far the weakest means by which to label GM food outside of the scheme of voluntary labeling (such as what Canada has already pushed for).[3]
This set of standards not only provides loopholes through which GM food may enter the food supply, but also opens the door to total acceptance of GM food absolutely free of labeling. The idea behind the substantial equivalence labeling method is that the GM food will be compared to its conventional counterpart in terms of safety and composition.[4]
The food would then only require a label if it was found that there was a substantial difference between the GM product and the natural food or there were an introduction of a common allergen through the process of genetic modification. While at first it may seem that there is a legitimate consideration of safety under these principles, such an impression is far from the truth.
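To make the contrast between process-based labeling and substantial equivalence concrete, here is a minimal schematic sketch of the two rules as described above (my own illustration, not any actual Codex text or implementation):

```python
from dataclasses import dataclass

@dataclass
class GMFood:
    """Schematic record for contrasting the two labeling regimes."""
    genetically_modified: bool
    introduces_known_allergen: bool
    substantially_different: bool  # vs. its "traditional counterpart"

def process_based_label_required(food: GMFood) -> bool:
    # Process-based labeling: the modification process itself triggers a label.
    return food.genetically_modified

def substantial_equivalence_label_required(food: GMFood) -> bool:
    # "Substantial equivalence": a label only on a found difference or allergen.
    return food.genetically_modified and (
        food.introduces_known_allergen or food.substantially_different
    )

# A GM food judged "equivalent" to its conventional counterpart needs no label
# at all under the second regime -- the loophole described in the text.
gm_crop = GMFood(True, introduces_known_allergen=False, substantially_different=False)
print(process_based_label_required(gm_crop))            # True
print(substantial_equivalence_label_required(gm_crop))  # False
```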
Several problems exist with the concept of substantial equivalence. First, as is often the case with government and bureaucratic initiatives, the semantics of the term “substantial equivalence” leaves the door open to the possible acceptance of virtually all GM food. While I will discuss this aspect further in future articles where the accepted Codex guidelines for testing GM food is mentioned, brief mention is still required early on in order to understand the dangers of the use of this labeling standard.
In order for a food to require labeling, it must do one of two things – introduce a new allergen or be significantly different from its “traditional counterpart.”[5] The former requirement refers to the introduction of something along the lines of the peanut gene, or of another common allergen, to a food, thereby causing a potential allergic reaction after consuming it. However, there are thousands of food allergies besides peanuts. Codex itself admits in its GM food test protocol that the determination of what may be an allergen is a very difficult procedure. It says, “At present, there is no definitive test that can be relied upon to predict allergic response in humans to a newly expressed protein.”[6]
Although the guidelines go on to say that these potential allergens should be tested on a case-by-case basis, it is clear that the testing mechanisms being recommended are not necessarily geared toward determining the potential allergenicity of newly introduced GM foods, especially on the scale needed to deal with the immense diversity of GM prototypes being introduced and the even greater variety of individual allergies that exist in the population.
It should also be noted that while there is some discussion of known allergens, there is no in-depth discussion of the very real possibility of new and previously unknown allergens being introduced due to the process of genetic modification. Indeed, the monitoring of the food once it enters the food chain is only occasionally mentioned throughout the Codex “Foods Derived From Modern Biotechnology” document and those mentions are vague and open-ended.[7] So the question that follows is whether or not all of these potential allergens will be labeled as such, or if only the most common ones will be considered.
Second, the requirement that a food must be compared and found substantially equivalent to its “traditional counterpart” (natural food) is misleading as well. To begin with, one must ask what exactly “substantial equivalence” means. Quite obviously, the term does not mean that the GM product must be identical. This, in itself, would negate the process of genetic modification.
Therefore, differences must necessarily be accepted. However, it is not at all clear just to what level these differences may exist and still be considered equivalent and/or safe. Nowhere is “substantial equivalence” clearly defined. The criterion for what is substantial and what is not is left completely open and subjective.
The closest thing there is to a definition comes from Nick Tomlinson of the UK Food Standards Agency in his report, “Joint FAO/WHO Expert Consultation on Foods Derived from Biotechnology,” in which he references the 1996 expert consultation, where substantial equivalence was defined as “being established by a demonstration that the characteristics assessed for the genetically modified organism, or the specific food product derived there from, are equivalent to the same characteristics of the conventional comparator.”[8]
Here again the term equivalence is used with the connotation that equivalent does not translate into identical or same. Tomlinson makes this clear when he says:
“The levels and variation for characteristics in the genetically modified organism must be within the natural range of variation for those characteristics considered in the comparator and be based upon an appropriate analysis of data.”[9]
By not exactly being descriptive as to how wide a range this “natural range of variation” may be, it is apparent that substantial equivalence does not correlate to identical or even anything that would remotely be considered the “same.” Indeed, the very nature of genetic modification precludes this as a possibility to begin with.
The concept of substantial equivalence is unfortunately the theory of labeling requirements adopted by Codex. It is also very similar to the criteria used in the United States and Canada.
As is to be expected in such pro-GM countries as the United States, the GM labeling requirements are even less restrictive than those of Codex. For the most part, labeling of GM foods in the United States and Canada is completely voluntary.
This voluntary labeling scheme based on the concept of substantial equivalence is both a prime example of the weakness of both standards and a dark omen as to the direction of Codex guidelines as they continue to be developed.[10]
Sources:
[1] MacKenzie, Anne A. “The Process of Developing Labeling Standards For GM Foods In The Codex Alimentarius.” AgBioForum, Vol. 3, No. 4, 2000, pp. 203-208. http://www.agbioforum.org/v3n4/v3n4a04-mackenzie.htm Accessed May 24, 2010.
[2] “Canadians Deserve To Know What They Are Eating: Food Safety Must Come Before Trade.” Canadian Health Coalition, Media Advisory, May 1-4, 2001. http://www.healthcoalition.ca/codex.html
[3] Ibid.
[4] “Safety aspects of genetically modified foods of plant origin, a joint FAO/WHO consultation on foods derived from biotechnology, Geneva, Switzerland, 29 May – 2 June 2000.” World Health Organization. http://www.who.int/foodsafety/publications/biotech/ec_june2000/en/index.html
[5] MacKenzie, Anne A. “The Process of Developing Labeling Standards For GM Foods In The Codex Alimentarius.” AgBioForum, Vol. 3, No. 4, 2000, pp. 203-208. http://www.agbioforum.org/v3n4/v3n4a04-mackenzie.htm Accessed May 24, 2010.
[6] “Foods Derived From Modern Biotechnology.” Codex Alimentarius, 2nd Edition, p. 20. ftp://ftp.fao.org/codex/Publications/Booklets/Biotech/Biotech_2009e.pdf
[7] Ibid.
[8] Tomlinson, Nick. “Joint FAO/WHO Expert Consultation on Foods Derived from Biotechnology.” 2003. ftp://ftp.fao.org/es/esn/food/Bio-03.pdf Accessed May 24, 2010.
[9] Ibid.
[10] “Guidance For Industry: Voluntary Labeling Indicating Whether Foods Have or Have Not Been Developed Using Bioengineering: Draft Guidance.” Food and Drug Administration, January 2001. http://www.fda.gov/Food/GuidanceComplianceRegulatoryInformation/GuidanceDocuments/FoodLabelingNutrition/ucm059098.htm
Original post @ http://www.activistpost.com/2013/02/codex-alimentarius-and-gm-food.html