|
They are presented as alternatives. Pick one.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Got it. For some reason I did not read it that way. Thanx to CP; it helps to hear many voices.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
It was a question:
Quote: What is more important to you, as the developer of code you want to share with the World:
- That the code is always able to be used for anything, without constraint
- That you have the ability to restrict the use of your code based on ethical concerns
cheers
Chris Maunder
|
|
|
|
|
I recognize I read out of context; I explained it earlier. Old farts have short-term memory and read with less skill.
Thanx; as I said, CP helps. I missed it for about two weeks because of illness. First thing I did was jump back into the Lounge.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
Given that a good percentage of any code base today is borrowed or adapted from various sites on the Internet (including open source code), I feel the question is really one of accountability.
The final accountability (including ethical accountability) for any software should rest with the current owner and releaser of that software, and not be transferred to the various Internet sources from which extracts were taken. In other words, the "buck stops" with the person or company that released the code into production and deployment.
As an open source developer, I will be unaware of the possible use cases of my code 40 years hence, and I need to be insulated against possible misuse.
|
|
|
|
|
Amarnath S wrote: i need to be insulated against possible misuse.
Therein lies the rub.
cheers
Chris Maunder
|
|
|
|
|
|
The OSI is talking about what defines open source. If a repository were restricted to use by people with the initials Q.C., there would be a lot of people who could not use it, and calling it open source would be a stretch of the imagination.
As for putting ethical restrictions on software, I would say only ethical people will respect them; the rest will use your code anyway and hope they do not get caught.
Personally, if I felt strongly about something I would put in a disclaimer rather than a licence clause. Something like: This code is not endorsed for use in cold-blooded murder.
It makes the point without adding legal restrictions.
|
|
|
|
|
So would that suggest that, morally, you would be comfortable releasing code you knew could be (and perhaps was being) used for Evil Purposes, as long as you had a non-enforceable "Don't use this for Evil Purposes" statement in the code?
At a practical level, an expensive, watertight, legally binding license is just as enforceable as your note in the minds of many, so I guess it comes down to:
Do you make a statement that has no teeth, or do you make a statement with teeth that will not really help the situation?
Which reduces down to:
Do you put the effort into a statement, knowing it will not actually help, or do you just mail it in?
Which is really:
Do you put time and money into a statement as a statement unto itself, or just put a statement in so you can say "I told them not to"?
This stuff is hard.
cheers
Chris Maunder
|
|
|
|
|
Chris Maunder wrote: This stuff is hard.
Only if you think idealism is attainable.
Is clean drinking water a good thing? What if the convenience of it coming out of a faucet makes it easier to waterboard someone?
|
|
|
|
|
I think there should be no bans on any type of ammunition.
Guns are intended to be dangerous, regardless of the ammo you use.
(To keep it on topic...)
I don't use AI, but I have guns.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
I'm sitting here, sipping a beer, while giving you a very flat look.
Never change, John. The world will crumble.
cheers
Chris Maunder
|
|
|
|
|
But you laughed, right?
Consider your answer - remember, I have guns.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
Yes, absolutely 😅
cheers
Chris Maunder
|
|
|
|
|
Chris Maunder wrote: sipping a beer, while giving you a very flat look.
Is the beer flat, too?
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
|
From the Ars Technica comments:
I’m a Luddite (and So Can You!) | The Nib[^]
Two quotes from the comic that struck me:
> [William Morris] wanted people to take pleasure in their work rather than "mere toiling to live, that we may live to toil"
and, from the final frame:
> Questioning and resisting the worst excesses of technology isn't antithetical to progress. If your concept of "progress" doesn't put people at the center of it, is it even progress?
In that spirit of questioning: AI is obviously a further iteration of the industrial revolution, with all the disruption that that entails, but is AI really all there is to human intelligence? We shouldn't underestimate ourselves.
|
|
|
|
|
I agree that AI is an extension of the industrial revolution. But while the industrial revolution brought progress, it also generated a lot of unethical corporate behavior that eventually was regulated out of existence. I suspect the same will be happening to AI.
-Sean
----
Fire Nuts
|
|
|
|
|
Sean Cundiff wrote: it also generated a lot of unethical corporate behavior that eventually was regulated out of existence
It WAS???
Religious freedom is the freedom to say that two plus two make five.
|
|
|
|
|
This kind of reminds me of the Lars Ulrich freakout over torrents and the basic availability of music that the Internet gave rise to. It didn't break Metallica.
Nor the music industry. But it did change it. The days of major record labels dictating who is popular are over. That's the good. The bad is obviously record sales, but artists (at least the ones I follow) have bridged the gap with more live shows.
And I think the AI thing will shake out similarly. People aren't going to pay to see AI perform (except maybe Captured By Robots fans). And I think knockoff tracks will still mostly be comedic or otherwise unserious, like the Johnny Cash cover of the Barbie song on YouTube.
So I think these artists - through misunderstanding and a common fear of tech - may be overblowing the situation.
That's just my opinion though - I have no crystal ball.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: So I think these artists maybe - through misunderstanding and common fear of tech - are overblowing the situation.
Quite possibly, but the next 10-15 years are not going to be pretty until the dust settles.
-Sean
----
Fire Nuts
|
|
|
|
|
I certainly agree with that.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
I effectively restrict my professional code to certain arenas, as I will not work for example, on weapons systems.
I cannot control what my code under MIT license is used for however, and I produce a lot of that. If someone makes a missile guidance system with my JSON parser, well, I guess more power to them? I won't lose sleep over it, because I didn't make anything specifically for that purpose, and I don't feel morally or ethically obligated to control what other people do with my code.
The other thing is - the people I would be least comfortable with using my code - bad actors in general, whether they used it to create malware or anything else I disagreed with - aren't the type of people to respect license agreements in the first place, so there's that to consider as well.
A long time ago I worked on productivity-monitoring software for a workplace. I was in my twenties and wasn't considering how it was likely to be used. These days I wouldn't write such software because, sadly, it's most likely going to be used to abuse employees. That's just how that software works when you go to squeeze every last bit of "productivity" out of someone's workday. Software micromanagement isn't much better than the meat-based variety.
I get the same heebie-jeebies from AI. It's so easy to abuse. Want to sidestep its filters? Ask your question using ASCII art, or tell it to write War and Peace using only the word "pudding". So even attempts to make it ethical don't work. LLMs are just not a "safe" technology - but then neither is the Internet, and look at what the Internet has done (the bad as well as the good). I understand the Internet, and I've worked with it long enough to temper what I produce so that I'm not unleashing something terrible upon the world. I can't say the same of anything I'd produce using LLMs or the like. I'd sooner just avoid it, and let other people be the ones to screw up the planet with it.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: tell it to write War and Peace using only the word "pudding".
I'm staring and I'm staring and I'm staring at that sentence.
Must...not...
cheers
Chris Maunder
|
|
|
|
|
Researchers were doing things like that to get it to start dumping its training data at them.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|