The weather has been rotten for the last two days, and during this time events and conversations have made me wonder if attitudes have ever really changed in certain countries, or whether once again we are just fooling ourselves. Over the years, and definitely since the 1960s, attitudes in many areas of life have changed tremendously, especially with regard to race, sexuality and the balance between the two sexes. Around the world genuine attitudes have moved on, but in the UK and America especially, attitudes are still, on the whole, very 'Victorian'.
In France, Germany, Spain and many other European countries, the attitude to life, and to naturism especially, is very much live and let live. Here in the UK and in America, it is very hush-hush: don't tell anyone, hide it away, and the urge to control things comes from the powers that be.