Food Choices and Teenagers

I have the pleasure of being a parent to teenagers, and I really do mean that. The only real disagreements that occurred were over food. Suddenly, regular mealtimes were a thing of the past as the teens would eat at odd times and often eat out at “friends’” or, more truthfully, the nearest fast food restaurant or baker’s! There would also be empty crisp packets, fizzy drink bottles or, even worse, energy drink cans left in their bedrooms. My children had become teenagers!

Even though we usually consider the teenage “stage” to begin at the age of 13, arguably it really starts when the child enters puberty. Adolescence is defined by the World Health Organization (WHO) as the “period in human growth and development that occurs after childhood and before adulthood, from ages 10 to 19”. As puberty begins, hormonal changes cause physical changes in the body which can, in turn, influence emotions. Often the adolescent will begin to question their parents’ advice and values, preferring to turn to their peers for guidance and not communicating well with family members. Suddenly, peer influence becomes important, and this may have an impact on eating habits.

Teenagers often become “grazers”, eating as they get hungry rather than at set mealtimes, which can make it difficult to ensure they have a healthy diet. Family mealtimes are of significant importance when trying to ensure adolescents eat sufficient vegetables and fruit. In addition to this, many young adults will skip breakfast, which may have negative consequences for weight gain and glycaemic control, and means missing out on the various health benefits which have been attributed to eating breakfast (see here). For instance, eating breakfast cereals is considered to help fulfil the Reference Nutrient Intake (see here for a clear explanation of RNI and nutrient intake) and may have a further impact on eating patterns throughout the day.

As an example of how teenagers are failing to meet recommended intakes, according to the National Diet and Nutrition Survey (NDNS), published in 2014, only ten percent of boys aged between 11 and 18 met the UK “5 a day” recommendations for fruit and vegetables, with only seven percent of girls in the same age group meeting the target amounts. Furthermore, consumption of oily fish was very low, with an average of just 21g per week eaten by 16-24 year olds, in comparison to the 140g per week recommended. Macronutrient and micronutrient needs change during puberty; a few examples of this can be seen below (as taken from here):

  • Boys aged 15-18 require an increased protein intake: 55.2g, in comparison to 28.3g for children aged 7-10.
  • Children aged 7-10 need 8.7mg of iron, whereas females from age 11 up to age 50 require 14.8mg of iron.
  • Both males and females need an increase in calcium during ages 11-18.

Adolescence can often also be a time for experimentation. Teenagers often start trying alcohol, and the latest findings from the NDNS show that forty percent of participants aged between 16 and 24 drank alcohol in a four-day period; of more concern, five percent of children aged 11-15 admitted to drinking alcohol in the same period (see chapter 8 of the NDNS).

For parents and carers, adolescence can be a challenging time to try to ensure that the young person is eating regularly and, most importantly, eating a wide variety of nutrients. Sitting down to meals together as a family at least a few times a week may, just a little, help to set parents’ minds at rest that their teenager is eating well… occasionally!

Mmm… Chocolate!

Like many other people I like to eat chocolate. It is an amazing product that has been consumed by humans for thousands of years, initially as a hot drink and eventually as the chocolate bars we know (and love) today. Cocoa is first thought to have been used by the Olmec people, who are believed to have grown Theobroma cacao trees as a crop. This use of cocoa beans was, in turn, passed on to the Mayans, who then traded with the Aztec people (please see this History of Chocolate timeline).

Chocolate is often considered a guilty pleasure: bad for us, causing spots and piling on the calories. However, there is much evidence to show that chocolate doesn’t quite deserve the bad image it is labelled with. Throughout history, chocolate, or cacao, has been considered a remedy for many ailments by many different cultures, including being used to treat fatigue, stimulate digestion and reduce fevers, as researched by Dillinger et al., 2000. The research, published in the Journal of Nutrition, investigated worldwide evidence on the use of chocolate to cure and aid patients with a variety of illnesses, often with the addition of numerous spices, which varied depending on the ailment. One of the “cures” agreed on by nearly all of the documented evidence was the usefulness of chocolate in aiding weight gain in patients who were underweight. No surprises there, unfortunately!

One quality that chocolate has recently been credited with is that it contains dietary flavonoids, which can be responsible for an antioxidant effect on the body, in this case by promoting cardiovascular health, according to various articles (see here, here and here). This is possible because our guts contain beneficial bacteria, for example Bifidobacterium, which ferment the cocoa found in dark chocolate. Cocoa contains polyphenol compounds, namely catechin and epicatechin, along with some dietary fibre; these are not easily absorbed by the human body, but are metabolised by the good bacteria into smaller molecules. These smaller molecules are easier to absorb and also have anti-inflammatory properties, which can lower inflammation in the body, importantly in cardiovascular tissue, and may reduce the risk of strokes in the long term.

Unfortunately, any health benefit is limited to the consumption of moderate amounts of dark chocolate only, as there is evidence to show that milk (and so milk chocolate) inhibits the absorption of the antioxidants ingested from the chocolate. A review of the effect of cocoa on blood pressure found that a small reduction in blood pressure could be achieved over the short term and recommended further investigations into longer-term effects. Polyphenol-rich dark chocolate (PRDC) has also been considered as a way of improving insulin sensitivity with a view to delaying the onset of diabetes. In a study presented by the Society for Endocrinology, PRDC was tested on 61 non-diabetic adults over a four-week period in a randomised controlled parallel trial, with the results showing a significant lowering of insulin levels in volunteers taking 20g of PRDC in comparison to those ingesting 20g of a placebo chocolate.

All in all, I love the whole idea of cocoa: it has a long history and a strong connection to past civilisations; it is often made by artisans, who are truly passionate about their product, into the most wonderful of little treats; and chocolates are frequently given as a romantic gift for a loved one. If chocolate is also considered an aid to a healthy diet, then who am I to argue with science… though I will hint for polyphenol-rich dark chocolates this Valentine’s Day!

Bring on Compulsory Cooking in Schools

A few years ago (!) I was very fortunate to be a food technician in a school which had a fantastically inspiring food technology teacher. This lady had been teaching pupils about the importance of healthy diets, locally sourced food and food miles many years before they became buzzwords for politicians and campaigners. As an older teacher nearing retirement, she had weathered all kinds of changes in education, not least because she had initially worked at a school which had a farm, where “field studies” really had meaning! The subject of food in schools has gone by many names: Land Studies, Rural Science, Home Economics, Food Technology, Food and Nutrition and Catering, but no matter what the name, its importance as a subject for young people is apparent; something my Food Technology teacher believed to her core! Many children do not learn how to cook by watching their mother as in years gone by, and this skill is being sorely missed, as many young people have no idea how or what to cook for a balanced diet. Consequently they may turn to ready-meals, which are often high in fat (especially saturated fat), salt and sugar, and contain minimal amounts of vegetables. With health issues such as obesity and heart disease, plus environmental concerns such as food waste, surely cooking skills of some kind should be essential in schools?

In 2008 a compulsory cooking in schools programme, called Licence to Cook (with information also available from The National STEM Centre), was announced by Ed Balls and Alan Johnson in conjunction with a joint obesity strategy called “Healthy weight, healthy lives“. This plan was celebrated by many in education, and funding was made available to start training more food teachers and Higher Level Teaching Assistants (HLTAs) in the run-up to 2011, when compulsory cookery lessons would begin in secondary schools. The idea was to move Food Technology away from teaching about mass production, manufacturing and design towards more practical lessons based on kitchen skills, nutrition, food safety, hygiene and even how to shop wisely. These are useful skills for a child’s personal development and could lead to improved nutritional choices as an adult (see here for some background information).

This was delayed when the Coalition Government decided to carry out a review of education, but from September 2014 the teaching of Food and Nutrition became compulsory up to Year 9 (so Key Stage 3) and is included in the Design and Technology programme of the National Curriculum for England. At Key Stage 3, students will be expected to cook a range of dishes, demonstrating their ability to prepare varied and healthy meals for themselves and others, and to explore different cooking methods and adapt recipes to their own designs.

Learning to cook is a valuable and important tool for young people, helping them take a step towards independence and making good food choices, but in addition to this, cooking also integrates into other parts of the National Curriculum:

  • Mathematics: weighing and measuring
  • Science: effect of heating, chemical reactions, bacterial control
  • English: reading and writing recipes
  • Religious Studies, Modern Languages: cultural restrictions and diets
  • Design and Technology: designing new recipe ideas

I sincerely hope the future of Cooking in schools is safe and that all young people in the UK will have the opportunity to develop new skills enabling them to cook meals that will contribute to a healthy lifestyle.

It’s the time of year for dieting… or may I suggest a brisk walk?

After the usual Christmas indulgences many people begin to consider the “D” word, and evidence of this is in newspapers and magazines throughout January. The number of new diets, health programmes, pieces of advice and research studies is staggering, and even health professionals can be hard pushed to know which would be beneficial and which just will not work. When it comes to diets, the short-term gain (or should I say… loss?) can be obvious: “I lost 5lb last week”. But how do we know that the diet will help in the longer term? The Huffington Post published an article about the celebrity trainer Harley Pasternak’s travels to find the healthiest diets in the world and explore what they have in common. The diets he placed in the top five were:

  • The Mediterranean Diet (See here for further information)
  • New Nordic Diet (See here for further information)
  • Traditional Okinawa Diet (See here for further information)
  • Traditional Asian Diet (See here for further information)
  • “French Paradox” food (See here for further information)

I can see merit in all five of the diets suggested above, and having explored a little of the available research on some of these diets in the past, I am sure there will be more evidence in the future to show the health benefits of (often) more traditional eating habits. However, as emphasised by Mr Pasternak, there is a common link between all of these diets which actually has nothing to do with food… in all of the listed countries, most of the people walk a lot more than, for example, the average American.

Walking every day has also been in the news recently, with the Daily Mail reporting on a Cambridge University study which found that walking 20 minutes a day may reduce the risk of premature death in adults. This was a large European study of 334,161 men and women, carried out over 20 years as part of the European Prospective Investigation into Cancer and Nutrition (EPIC). The study was led by Professor Ulf Ekelund, who comments on the University of Cambridge research webpage, “This is a simple message: just a small amount of physical activity each day could have substantial health benefits for people who are physically inactive”. Additionally, back in June 2014 The Guardian newspaper ran an article about the Ramblers and Macmillan Cancer Support, who produced a review showing how walking can make a difference to people’s lives; not just by helping to reduce weight, but by improving social connections and mental health.

I cannot help but agree that physical activity is a key issue, possibly even THE key issue, in reducing obesity and lowering heart disease, especially if it encourages people to step outside their homes and interact with other people. I’m lucky (sort of!) that I have a dog who, even when I would rather stay home and put my feet up, will need a walk. I have heard of people who “borrow” a dog from a friend or neighbour so they can walk it, which, if you have children, can be an amazing incentive to leave the computer games and put on a coat. As for the top five diets researched by Harley Pasternak, they all have their good points and we can all learn from them, especially the suggestions of serving smaller portions and trying to eat seasonal, locally sourced food… maybe try walking to a local market and kill two birds with one stone!

Should Obesity be classed as a disability?

With the recent ruling from the European Court of Justice stating that obesity can be classed as a disability, I confess that, as someone who is passionate about the importance of a good diet and healthy lifestyle, I found myself a little lost for words. In England, 24.7% of adults are obese according to the Health and Social Care Information Centre (HSCIC) survey published in 2014, which shows figures for 2012, with 61.9% of adults being obese or overweight. These figures show a steady increase since 1993, as illustrated in the Health Survey for England 2012 graph shown here.

Not everyone who is obese or overweight has gained weight due to excess food consumption and lack of exercise; there are some medical reasons that a person may be gaining weight.

People with conditions that affect their weight will usually be advised on how to control weight gain through diet and exercise, though I can fully appreciate the difficulties experienced with this.

Going back to the general population of people who are obese and who do not have a medical condition affecting weight gain: should they be classed as disabled due to a self-inflicted condition? The Oxford Dictionary defines the word disability as “a physical or mental condition that limits a person’s movements, senses or activities”, which, I hate to admit, can easily be applied to a person who is obese. The consequences of excess weight on the body are extensive. Weight gain can put excessive strain on joints, especially the knees, hips and back, and lead to degenerative joint disease, for example osteoarthritis, causing discomfort or pain. This means the individual finds physical activity difficult, especially when combined with breathlessness, tiredness and low energy, all of which can be present in people who are obese. Obesity can also lead to low self-esteem and may cause depression in some cases, which, once again, may limit the person’s ability to cope with everyday stresses, including work and social environments. Long term, obesity can impair an individual’s quality of life, with risks including:

  • Type 2 diabetes
  • Coronary heart disease
  • High blood pressure
  • Strokes
  • Fertility complications

Some of these conditions will dictate whether the person can actually hold down a job and, even if they can, will probably still mean time taken off work to attend doctor’s and hospital appointments.

So, I should be agreeing with the European Court of Justice’s ruling on obesity being classed as a disability, shouldn’t I? Well, no, I don’t. Taken to an extreme, this ruling could give some individuals the excuse they need not to face the drastic lifestyle changes required to return themselves to a safe weight, and could encourage them to stay overweight, spiralling slowly into further weight gain brought on by an already sedentary lifestyle and poor diet. I cannot see how this can benefit the person concerned, or the employer, who will be expected to pay for upgrades in furniture, for example larger, stronger chairs for the “disabled” employee. The estimated direct cost to the NHS of treating overweight and obesity in 2007 was £4.2 billion, and I cannot see this figure improving until obesity figures decrease. This can only come about through lifestyle interventions; in other words, educating obese people to improve their diet, increase their physical activity and take responsibility for their condition, so that they become aware that their current way of life must change to prevent further weight gain. How can this happen if obesity is being considered similar to losing a limb or being blind, both of which cannot be cured, when obesity most certainly can be?

Eating out made easier for consumers with food allergies

From 13th December 2014 a new EU rule came into force in the UK that requires restaurants, takeaways and canteens to be more transparent about food ingredients. The EU Regulation 1169/2011 specifically targets the fourteen main allergens (see the information chart below by the Food Standards Agency) that affect the 1.92 million people who have a food allergy in the UK, not including the many individuals who have a food intolerance. This should mean that food allergy sufferers will be able to eat out in restaurants and order takeaways with more confidence that the food they are eating will not cause an allergic reaction, with symptoms ranging from mild discomfort to life-threatening.

The Food Standards Agency (FSA) reported that the new regulations should help reduce the number of people accidentally ingesting something that may cause an allergic reaction, and so reduce the number of deaths (on average 10 a year) and hospital admissions (nearly 5,000 a year); these figures show an increase of 87% since 2002. In partnership with Allergy UK, the FSA reports that seventy percent of people affected by food allergies will avoid eating takeaway food as they do not feel confident that the food will be free of specific allergens, even if told by the food preparation staff that it does not contain that ingredient.

Food allergies cannot be treated or cured; they can only be controlled through careful management, by avoiding the allergen or by treating the symptoms after ingestion, so it is vital that the sufferer is not exposed to any allergen which affects them. It is clear that having a food allergy can have a profound effect on the quality of life of the person concerned, as observed by Mills et al, and in addition it can also affect the daily lives of family and friends. For example, food preparation at home, celebrating family birthdays, weddings and even the office Christmas party can prove hazardous for someone with a food allergy and make people think twice about ordering a takeaway or booking a restaurant for a meal. In an article by Bollinger et al, forty-nine percent of caregivers (parents, grandparents, guardians etc.) considered social activities to be affected by the food allergy of the child in their care.

 

Obviously the new legislation has had both a financial and a time cost for food businesses, with new staff training, allergen notices and information packs on menu ingredients needed to comply with the new regulations. I’m sure that a lot of business owners have considered this a burden and an inconvenience, especially businesses with a high staff turnover or menus that change frequently, as reported by The British Hospitality Association in August of this year. Surely, though, being able to provide transparency on food ingredients, keep food allergy sufferers safe, gain trust in the food prepared and give more consumers access to takeaways and restaurants will only improve customer footfall for a food retailer. Most food businesses have recipes for staff to follow which list ingredients, so putting these in a folder for the customer to access should not prove too difficult a task… unless there is something to hide! I personally feel that this is a positive step, not just for food allergy sufferers but for consumers as a whole. I certainly like to know what is in my food, and having access to documentation listing the ingredients for each dish on a menu means customers can make informed choices about their food.

The EU Regulation also applies to allergen labelling of food items, meaning packaged food has to have ingredient lists with the allergens clearly stated, for example in bold or in a different font or typeface. It will also bring into force obligatory nutrition labelling for foodstuffs in 2016, so I predict that food packaging is set to become bigger to hold all the information needed… or we will all need magnifying glasses to go shopping!

 

Campylobacter in our chickens… surprise, surprise!

The BBC recently reported on the high percentage of chickens sold in supermarkets that are infected with the organism Campylobacter. This is a foodborne pathogenic microorganism which is responsible for more cases of food poisoning than any other in the UK, causing diarrhoea, abdominal pain, fever, nausea, vomiting, headaches and muscle pain. The Food Standards Agency (FSA) has released the findings from the first six months of a year-long investigation into Campylobacter levels in whole chickens sold in supermarkets, small retailers and butchers.

The results from the FSA showed that of the 1,995 chickens tested, 70% tested positive for the presence of Campylobacter, with 18% having levels above the highest band of contamination (above 1,000 colony-forming units per gram, >1,000 cfu/g). The findings actually show an increase in the presence of Campylobacter since the study began in February 2014, but this is probably due to it being warmer during the summer months when the latest samples were taken. Campylobacter lives in the gut of healthy poultry, cattle and sheep as part of the natural microflora of the intestinal tract of these domesticated animals, and so can be present in many slaughtered animals on sale in supermarkets and smaller retailers. For more information on Campylobacter please see The Bad Bug Book produced by the U.S. Food and Drug Administration.

Supermarkets need to do their best to reduce any foodborne pathogens, but please note, I say reduce, as it is very unlikely that Campylobacter could ever be completely eradicated from all foodstuffs and, as seen in the FSA results, the levels will vary depending on external factors such as temperature and storage conditions. Also, there must be some sense of food hygiene responsibility on the part of the consumer; after all, it is not difficult to kill Campylobacter, you just need to ensure that the chicken is cooked thoroughly. Many people wash chicken before cooking, thinking to remove any impurities, though there is strong evidence to show this practice leads to cross-contamination, which is a message being promoted by the FSA in their “Don’t wash raw chicken” Food Safety Awareness campaign this year. Not washing hands or equipment correctly before, during and after food preparation also has strong links to food poisoning cases, as shown in a paper published in Applied and Environmental Microbiology and a study printed in the International Journal of Microbiology. Both these investigations show that cross-contamination in domestic kitchens can be instrumental in allowing foodborne pathogens to spread via hand and equipment contact to ready-to-eat food, which is then ingested by the food preparers and other members of the household.

I’m looking forward to the results being published for Campylobacter-infected chickens sold by smaller retailers, such as butchers, and seeing whether there are similarities between these little guys and the big guns!