Everything posted by TheGreek

  1. I have black knobs on order so that's only a temporary issue. The machine heads were a set of 4x black/silver, 1x gold and 1x silver so not a theme as such. I've swapped out the odd gold for a silver and "matched" the scheme. I'd like to find gold hardware to swap everything out for - bridge included.
  2. If I didn't already have a 6er I'd give this serious consideration. After all, £140 to experiment with a 6er can't be sniffed at.
  3. If you don't hand over the bass till he pays in full, you can't be on the losing end of the deal. If you're worried about a PayPal scam, don't accept a PP deposit - insist on cash only, in full, on collection.
  4. Andalusia in the Sky with Diamonds - The Beatles
  5. I'm tempted to round them off more.
  6. Berlin down the House - Talking Heads
  7. Prof Noel Sharkey says systems so infected with biases they cannot be trusted

     Henry McDonald, Thu 12 Dec 2019 14.07 GMT

     An expert on artificial intelligence has called for all algorithms that make life-changing decisions – in areas from job applications to immigration into the UK – to be halted immediately.

     Prof Noel Sharkey, who is also a leading figure in a global campaign against “killer robots”, said algorithms were so “infected with biases” that their decision-making processes could not be fair or trusted. A moratorium must be imposed on all “life-changing decision-making algorithms” in Britain, he said.

     Sharkey has suggested testing AI decision-making machines in the same way as new pharmaceutical drugs are vigorously checked before they are allowed on to the market. In an interview with the Guardian, the Sheffield University robotics/AI pioneer said he was deeply concerned over a series of examples of machine-learning systems being loaded with bias.

     On inbuilt bias in algorithms, Sharkey said: “There are so many biases happening now, from job interviews to welfare to determining who should get bail and who should go to jail. It is quite clear that we really have to stop using decision algorithms, and I am someone who has always been very light on regulation and always believed that it stifles innovation.

     “But then I realised eventually that some innovations are well worth stifling, or at least holding back a bit. So I have come down on the side of strict regulation of all decision algorithms, which should stop immediately.

     “There should be a moratorium on all algorithms that impact on people’s lives. Why? Because they are not working and have been shown to be biased across the board.”

     Sharkey said he had spoken to the biggest global social media and computing corporations, such as Google and Microsoft, about the innate bias problem. “They know it’s a problem and they’ve been working, in fairness, to find a solution over the last few years but none so far has been found.

     “Until they find that solution, what I would like to see is large-scale pharmaceutical-style testing. Which in reality means that you test these systems on millions of people, or at least hundreds of thousands of people, in order to reach a point that shows no major inbuilt bias. These algorithms have to be subjected to the same rigorous testing as any new drug produced that ultimately will be for human consumption.”

     As well as numerous examples of racial bias in machine-led decisions on, for example, who gets bail in the US or on healthcare allocation, Sharkey said his work on autonomous weapons, or “killer robots”, also illuminated how bias infects algorithms.

     “There is this fantasy among people in the military that these weapons can select individual targets on their own. These move beyond the drone strikes, which humans aren’t great at already, with operatives moving the drone by remote control and targeting individual faces via screens from bases thousands of miles away,” he said.

     “Now the new idea that you could send autonomous weapons out on their own, with no direct human control, and find an individual target via facial recognition is more dangerous. Because what we have found out from a lot of research is that the darker the skin, the harder it is to properly recognise the face.

     “In the laboratory you get a 98% recognition rate for white males without beards. It’s not very good with women and it’s even worse with darker-skinned people. In the latter case, the laboratory results have shown it comes to the point where the machine cannot even recognise that you have a face.

     “So, this exposes the fantasy of facial recognition being used to directly target enemies like al-Qaida, for instance. They are not middle-class men without beards, of whom there is a 98% recognition rate in the lab. They are darker-skinned people and AI-driven weapons are really rubbish at that kind of recognition under the current technology. The capacity for innocent people being killed by autonomous weapons using a flawed facial recognition algorithm is enormous.”

     Sharkey said weapons like these should not be in the planning stage, let alone ever deployed. “In relation to decision-making algorithms generally, these flaws in facial recognition are yet another argument – along with all the other biases – that they too should be shut down, albeit temporarily, until they are tested just like any new drug should be.”

     (A rough sketch of what per-group testing like this could look like follows at the end of this list.)
  8. We need to show this to singers we know. If Elvis and the like are going to be available to record again in the not-too-distant future, they may need to up their game. Maybe they'll start to help load the van at long last.
  9. You'll be the only person ever to notice it....looking good.
  10. https://news.sky.com/story/ai-music-can-you-tell-if-these-songs-were-made-using-artificial-intelligence-or-not-12865174
  11. Could you live with a natural finish? Use the £30 offer reduction to buy some sandpaper (you could pick up a palm sander for about £20 too).
  12. Love the colour - gold hardware is a nice touch too.
  13. There's something in what @LukeFRC says. It would certainly be a talking point and make the bass stand out in a crowd of F copies. For my tuppence worth, I'd like to see the 2+2 headstock shortened by a couple of inches, and a similar amount taken off the lower horn. Are 5ers/6ers available? Any images?
  14. I'd like that with a white scratchplate. I already have a Sterling Mini though so....
  15. Q: Would adding a strip of leather on the pickups affect the signal?
  16. Now finished with a leather scratchplate.
  17. So, from here: to here: in a couple of weeks. I'm not disappointed.
  18. I went with the tan ultimately. It kept the natural tone thing going, which the white didn't. It was a bit of an experiment finding double-sided tape that stuck to the bass - I ended up having to clean some of the lemon oil off with a detergent spray for the Mammoth Powerful Grip tape to stick. A firm press/rub over the leather with the handle of a Stanley knife encouraged better adhesion. Going to let it settle before the final step of checking the wiring and plugging it in. Here it is:
  19. I'd rather have one of these than a Mustang. What is the RRP for this?? Wow!! https://www.ebay.co.uk/itm/115413481418?var=0&mkevt=1&mkcid=1&mkrid=710-53481-19255-0&campid=5338749374&toolid=20006&_trkparms=ispr%3D1&amdata=enc%3A1u_N4BKMiRXqX52cFljFdNA98&customid=GB_619_115413481418.145620983889~2062689180872-g_CjwKCAjw4ZWkBhA4EiwAVJXwqbXGr8WrVa29T45vfjl2kVLO9ZWgKqs-LYkWGbE6lKIsH5syHxrBKBoCSpcQAvD_BwE
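
The Guardian piece in post 7 above talks about per-group recognition rates (98% for one group in the lab, far worse for others) and about "pharmaceutical-style testing" on very large populations before deployment. Purely as an illustration of what that kind of per-group audit could look like (none of this is from the article; the group labels, toy counts and function name are made up), here is a minimal Python sketch: feed in labelled recognition outcomes, compute the hit rate per demographic group, and look at the worst-case gap.

```python
from collections import defaultdict

def per_group_recognition_rate(results):
    """results: iterable of (group, correctly_recognised) pairs."""
    totals = defaultdict(int)
    hits = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        if correct:
            hits[group] += 1
    # Hit rate per group.
    return {group: hits[group] / totals[group] for group in totals}

# Toy counts loosely echoing the disparity quoted in the article.
sample_results = (
    [("white_male_no_beard", True)] * 98 + [("white_male_no_beard", False)] * 2 +
    [("darker_skinned", True)] * 65 + [("darker_skinned", False)] * 35
)

rates = per_group_recognition_rate(sample_results)
disparity = max(rates.values()) - min(rates.values())
print(rates)                # {'white_male_no_beard': 0.98, 'darker_skinned': 0.65}
print(round(disparity, 2))  # 0.33
```

A real audit in the spirit the article describes would run this over a very large, representative test population rather than toy counts, and a deployment gate could then require the worst-case gap to stay under an agreed threshold.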