Time to Add a Social Element to our Filtration Systems?
Filters are a problem because they cannot, and never will, catch everything. (See Content Filtration: A Little Dirt for Your Health.) My biggest problem is that most teachers cannot request to have anything added (like access to educational Nings, Wikispaces, or an educational blog on Blogger, like mine), nor can they ask to have anything blocked. Someone at some company decides what is accepted and allowed, when in fact it should be a curricular decision made at the local level, with systems in place for reporting inappropriate sites and for requesting that others be allowed for educational use.
The article, "Internet Filter blocks education sites but not porn," says:
One site a Year 10 student opened while searching for a type of bird contained graphic sexual material and was only barred on Monday after inquiries from The Daily Telegraph.
George Cochrane said his school-aged son and daughter, who study by distance education from their farm in Grenfell, were horrified by the sites they could access.
Other educational sites and harmless web pages for the local member of parliament - and even Education Minister Verity Firth's own site - have been blocked by the filter.
I am specifically thinking back to an incident in a past project where an Australian teacher had to take personal time and go to the public library just to open the Google Doc and add her students to the project matrix, because there was no way at her school to ask for the site to be unblocked.
Looking at the new Digg bar, I have to wonder why filters can't start deploying some sort of social filtration: if a site is blocked, people should be able to request that it be unblocked or submit it for review, and if a site should be blocked, people should be able to report it. Right now, the reporting structures for filters are very much behind the scenes and accessible only to IT staff. Why aren't we using social filtration, for goodness sakes, or at least considering a social component for reporting?
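To make the idea concrete, here is a minimal sketch of what such a "social filtration" review queue might look like. This is purely hypothetical, not any real filtering product's API: the class names, request types, and example URL are all my own illustration of teachers and students submitting unblock requests or reports that curriculum staff can review locally.

from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class RequestType(Enum):
    UNBLOCK = "unblock"    # "this educational site is blocked; please review it"
    REPORT = "report"      # "this site slipped through the filter; please block it"


class Status(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    DENIED = "denied"


@dataclass
class FilterRequest:
    url: str
    request_type: RequestType
    submitted_by: str               # a teacher, student, or parent account
    reason: str                     # the curricular justification or the complaint
    status: Status = Status.PENDING
    submitted_at: datetime = field(default_factory=datetime.now)


class ReviewQueue:
    """Holds requests so curriculum directors, not just IT, can see and decide."""

    def __init__(self) -> None:
        self.requests: list[FilterRequest] = []

    def submit(self, request: FilterRequest) -> None:
        self.requests.append(request)

    def pending(self) -> list[FilterRequest]:
        return [r for r in self.requests if r.status is Status.PENDING]

    def decide(self, request: FilterRequest, approve: bool) -> None:
        request.status = Status.APPROVED if approve else Status.DENIED


if __name__ == "__main__":
    queue = ReviewQueue()
    # Hypothetical example: a teacher asks for a blocked project site to be reviewed.
    queue.submit(FilterRequest(
        url="https://educational-wiki.example.org/project-matrix",
        request_type=RequestType.UNBLOCK,
        submitted_by="teacher:adavis",
        reason="Students need this page to join the global collaborative project.",
    ))
    for req in queue.pending():
        print(f"{req.request_type.value}: {req.url} -- {req.reason}")

The point of the sketch is the design choice, not the code: requests carry a curricular reason and land in a queue that people outside the IT office can read and act on, instead of disappearing into a vendor's black box.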
What we're missing in filtration is the human element -- and not just the human element of IT directors, but that of curriculum directors and teachers, and yes, even students. With good filters in place that have a social component, perhaps we could begin allowing educational resources from YouTube and other sources to come through.
"We block the world" just doesn't cut it any more.