Kishotenketsu Programming?

mike_stay asks: "Imperative programming closely follows the 'outline' style of writing most of us were taught in elementary school. The Japanese, however, have a very hard time with that writing style, as they've been trained in the concept of kishotenketsu: stories are usually told by bouncing around between various points of view, which necessarily give different accounts; no attempt is made to say what 'really' happened. 'Good writing style' expects readers to draw the conclusions; writing that is too explicit is not valued. The writing, therefore, tends to be inductive: specific examples precede general principles. The closest thing I can think of to kishotenketsu in programming is functional programming or declarative languages, but then, I'm American. Would other readers point me at other languages with this type of 'eastern' feel?"
This discussion has been archived. No new comments can be posted.

  • Good stuff (Score:4, Funny)

    by Old Uncle Bill ( 574524 ) on Monday February 03, 2003 @11:26AM (#5215497) Journal
    ...hence the reason my shelves are not filled with Japanese literature.
    • by Anonymous Coward
      ...hence the reason my shelves are not filled with Japanese literature.
      Not to mention that your shelves are filled with Dr. Seuss.
      • > > ...hence the reason my shelves are not filled with Japanese literature.
        > Not to mention that your shelves are filled with Dr. Seuss.

        "writing that is too explicit is not valued."

        Am I the only one who put these three ideas together and (upon realizing that Westerners typically don't read manga for the words) ended up with a vision of "Horton Tentacle-Rapes A Who"

        • Am I the only one who put these three ideas together and (upon realizing that Westerners typically don't read manga for the words) ended up with a vision of "Horton Tentacle-Rapes A Who"

          Yes. I was contemplating the Zen nature of the Cat in the Hat.

  • Ki Sho Ten Ketsu (Score:5, Interesting)

    by Gaijin42 ( 317411 ) on Monday February 03, 2003 @11:45AM (#5215591)
    Kishotenketsu is an interesting writing style, and it lends itself to some interesting applications, particularly in philosophy or politics. (Plato's The Republic, while a narrative, follows a pattern similar to kishotenketsu, in that several unrelated examples or narratives are told, and then a final "bring them together" dialogue is presented which shows the relationship between the original narratives and leads the reader to the desired conclusion.)

    Kishotenketsu is a result of the Japanese aversion to direct confrontation, and consciousness of status. (For another example of status consciousness, a listener of music should not say "That is good music," because that implies that the listener is a superior musician, in a position to judge the player. Rather, the listener should say something like "Your music moves me.")

    In any case, programs have a purpose: they DO something. While it certainly makes sense to abstract where applicable, being circuitous is not a good programming method.

    I suppose you could do something like go calculate Pi out to find the circumference of a circle that has a radius of the constant that you want to use to calculate your taxes.

    But that seems dumb to me :)

    On the other hand, this could be a good anti-piracy methodology, if you put a bunch of unrelated code in your serial number validation routine.
    • Re:Ki Sho Ten Ketsu (Score:3, Interesting)

      by KDan ( 90353 )
      On the other hand, this could be a good anti-piracy methodology, if you put a bunch of unrelated code in your serial number validation routine.

      Security through obscurity? Tsk tsk tsk... nope. Doesn't work (though it makes the crackers' task more complex).

      As far as I'm aware, a program is in itself an imperative. So it's not the language(s) that cause problems - it's the very nature of programming, i.e. giving orders to a computer.

      A computer is still, at the moment, a slave, a mathematical idiot savant that you have to order around very specifically to get it to do what you want. Maybe once AIs become clever and interaction with computers goes mostly through human-like AIs, this kishotenketsu approach could be of some use, as it is in real life, but until then, I don't really see any use in non-imperative commands (wtf could that mean anyway) to a computer.

      Daniel
    • Re:Ki Sho Ten Ketsu (Score:4, Interesting)

      by MrWa ( 144753 ) on Monday February 03, 2003 @12:49PM (#5215898) Homepage
      In any case, programs have a purpose, they DO something. While it certainly makes sense to abstract where applicable, being circuitous is not a good programming method.

      And many people in business (Americans mainly) would say that being nonconfrontational in conversation seems dumb as well.

      The question brings up a good point, especially as more programming leaves the U.S. and heads overseas. What have become generally accepted programming paradigms may start shifting to better match the thinking process of those writing the code. It may be a stretch (for now, anyway) to expect kishotenketsu programming to work - computers can't grok what you really mean without being specifically told - but we will probably see examples of "nonstandard" programming techniques and, eventually, languages in the future.

      • The problem is that (at least as stated in the original blurb) the reader gets to draw their own conclusions, and that "writing that is too explicit is not valued". The problem is that anything that is going to do millions of things a second (computers do that, I think) needs to be told explicitly what to do. You could perhaps say that you have a standard interpretation, so it is like the same person reading the story every time.

        But any ambiguity is very bad if millions of decisions are made each second.
    • by Lovejoy ( 200794 )
      It makes for some fabulous literature.

      The problem with kishotenketsu isn't that it's so different from western thought. The problem is that most Japanese people don't learn it or any form of composition. Their education system is geared toward the 20th century. It produces excellent factory workers and "sarariman" but not great independent thinkers. It produces folks that know a lot, but don't necessarily know how to apply it in new ways.

      I would not be surprised to see this turn around in the very near future. It seems to me that Monbusho (the Ministry of Ed.) has seen the light and we may see some reforms come.

      In my experience, it takes them a long time to make a decision to change, but once the decisions have been made, they can move VERY fast. If they reform, the US had better get on the ball, or we'll be left in the dust.

      Finally, I have enormous respect for Japan. I think everyone who could possibly afford the trip should visit at least once. Japan is beautiful, complex, banal, overwhelming, frustrating, and fun all at the same time. It's a fabulous place.
    • When I worked at a Japanese software company, several programmers who could read English well told me that they preferred English manuals, even though they took an extra effort, because they were clear and precise.
    • Re:Ki Sho Ten Ketsu (Score:1, Informative)

      by Anonymous Coward
      Kishotenketsu is a result of the Japanese aversion to direct confrontation, and consiousness of status.

      No it isn't. Kishotenketsu is a direct evolution of classical Chinese literary formats, it has nothing to do with aversion or deference. It is an artistic format based not on delivering new information, but on referencing classical works already known to everyone. The reader must notice the references to understand the subtext.
      Anyway, I am glad that everyone liked my Powerpoint Syndrome essay, I wondered where all those hits were coming from!
    • "... a listener of music should not say "That is good music" because that implies that the listener is a superior musician, and in a position to judge the player. Rather, the listener should say something like "Your music moves me"

      This sounds an awful lot like "E Prime", which is English without the verb "to be", e.g. see this [generalsemantics.org]. It sounds extremely wacky when you first hear about it, but some of the rationale does make sense.
  • Uhm (Score:2, Interesting)

    by Koos Baster ( 625091 )
    Isn't functional programming (Miranda, Haskell, Gopher, ML, LISP, Scheme, Bla) a KIND of declarative programming? As is logical programming (Prolog, deductive databases).

    Well anyway, I'd like to point out that some of the most powerful features of declarative programming and object orientation come together in Hassan Ait-Kaci's
    WildLIFE [ucl.ac.be].
    It features strongly typed structured datatypes with inheritance, lambda terms, unification, resolution, backtracking... you name it.

    It may be outdated, but it's definitely worth investigating if you're into perversely powerful programming languages and like the declarative methodology.
  • by Bastian ( 66383 ) on Monday February 03, 2003 @12:09PM (#5215691)
    I'm not sure I would call functional programming circuitous. Granted, one has to learn to think in an inductive manner in order to handle recursion, but I have found functional programs to be, if anything, more directed than imperative ones. In an imperative language, you can sit down and twiddle a bit here and there, working towards the goal at your leisure. In a functional language, you can never lose sight of your goal. Every line of code has to have a clear and well-defined role in achieving that goal.

    That's why I tend to prototype functions & algorithms in a more lisp-style pseudocode even when I'm writing in C. I find that I end up writing much cleaner code when I prototype in functional pseudocode.
    • I find that I end up writing much cleaner code when I prototype in functional pseudocode.

      Could you give a simple example of what you mean? (lameness filter permitting of course)

      I have only just got my head around object orientation!

      Thanks,

      -ed

      • Take something involving a depth first search of some state space. This is generally an inherently recursive problem. Pseudocode that resembles Scheme (or Lisp if you're lazy, or FORTH if you're frisky, etc.) tends to force you to work within the recursive nature of the problem because you get beaten over the head with what my professors used to like to call "elegance." (Here, elegance is a euphemism for "no variables, no loops, everything is recursive, functions are first class.")

        If one were approaching the problem from another paradigm of programming and using the standard "Learn to Program in One Week" book spaghetti-pseudo, it would be easy to fall into using some horrible mess of overlapping loops or something like that. I've seen people turn relatively simple problems into gigantic tangles of wild pointers just because they didn't think recursively.
        • Take something involving a depth first search of some state space. This is generally an inherently recursive problem. ... If one were approaching the problem from another paradigm of programming and using the standard "Learn to Program in One Week" book spaghetti-pseudo, it would be easy to fall into using some horrible mess of overlapping loops or something like that. I've seen people turn relatively simple problems into gigantic tangles of wild pointers just because they didn't think recursively.

          Building a non-recursive recursive algorithm is simple. The only structure you need is a stack. It's the *exact* same thing your "recursive" algorithm is doing; the programmer is just being explicit about it.

          "Recursing" is as simple as pushing your state object onto the stack. Returning is as simple as popping. When the stack==0 and you return you really return. And guess what? No stack overflow! When you reach the end of your stack, you grow it. One minor bonus is not having to deal with function calls (at the cost of handling your own stack).

          Now, that's not to say that for some things true recursion isn't great. But a "gigantic tangle of wild pointers" it doesn't need to be.
          • So you trade code recursion for data recursion. It's still recursive. Often, data-recursive code is more complex to get right, too.

            As the parent post noted, some algorithms are inherently recursive. Unfortunately, many teaching examples given to beginners don't need recursion.

            Iterative tasks can be made recursive, but there's little point. For example, consider the classic "factorial()" and "fibonacci()" examples, which are really just iterative examples in which previous iterations feed future iterations. Although these tasks are "recursive" in the mathematical sense -- later elements are defined in terms of previous elements -- the computation itself is iterative.
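A hypothetical Python sketch of that point (example mine, not from the post): the classic definitions are "recursive" mathematically, yet each computes with a plain loop because later values depend only on earlier ones:

```python
def factorial(n):
    """n! defined recursively, computed iteratively."""
    result = 1
    for k in range(2, n + 1):   # each iteration feeds the next
        result *= k
    return result

def fibonacci(n):
    """nth Fibonacci number; previous iterations feed future ones."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```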

            Tasks which "fork", such as depth-first traversals of complicated structures, aren't simple iterative processes, since there is an implicit "go back" step that rewinds to a previous context where the fork occurs. You can translate these into a loop that lacks recursive function calls, but only if you sprout a data stack to keep track of these forking points. You've really just traded one stack for another and have not transformed the problem.

            --Joe
            • Tasks which "fork", such as depth-first traversals of complicated structures, aren't simple iterative processes, since there is an implicit "go back" step that rewinds to a previous context where the fork occurs. You can translate these into a loop that lacks recursive function calls, but only if you sprout a data stack to keep track of these forking points. You've really just traded one stack for another and have not transformed the problem.

              And of course that was what I was trying to point out. The original author said "it would be easy to fall into using some horrible mess of overlapping loops or something like that. I've seen people turn relatively simple problems into gigantic tangles of wild pointers just because they didn't think recursively." My point was simply that A) it's not that complex, and B) it's not just something that beginner programmers do (in fact, understanding a stack, and understanding that recursion is just using a stack, would require a more advanced programmer).

              I was just trying to point out that just because a problem is a "recursive" problem doesn't mean that one needs to solve it using the traditional recursive approach, and in fact, that there are good reasons not to.

              And we shouldn't limit ourselves just to creating a data stack. If one uses threaded trees you don't even NEED the stack; it's all stored within your tree already. Therefore one can solve the problem of walking a tree (doing "depth-first traversals of complicated structures") without any sort of stack at all.
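One way to sketch this stackless idea is Morris traversal, which builds and removes the threads on the fly rather than storing them permanently (hypothetical Python example; the class and names are mine, not from the post):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def morris_inorder(root):
    """In-order traversal with no recursion and no stack: temporarily
    thread each left subtree's rightmost node back to its root."""
    out, cur = [], root
    while cur:
        if cur.left is None:
            out.append(cur.val)
            cur = cur.right
        else:
            pred = cur.left
            while pred.right and pred.right is not cur:
                pred = pred.right
            if pred.right is None:
                pred.right = cur      # create the thread, descend left
                cur = cur.left
            else:
                pred.right = None     # second visit: remove the thread
                out.append(cur.val)
                cur = cur.right
    return out
```

The tree is restored to its original shape by the time the walk finishes, so the only cost is the extra predecessor scans.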
              • Threaded trees are kinda cool, but they do have some additional storage associated with them. Specifically, you have to have some way of distinguishing between a pointer to a child and a pointer back up into the tree. Also, how do you manage to get a post-order traversal out of one?

                Back to the topic: Once you make the translation from code-recursion to data-recursion, you also open yourself up to other neat transformations. For instance, the difference between a depth-first search and a breadth-first search is simply the difference between a stack and a queue. If you write your code appropriately, you can switch between one or the other painlessly. And if you use a priority queue, you can have a loop which morphs between the two or gives you something in-between, just by changing the priority function. (I find this last structure useful for maze generation.)
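The stack-vs-queue swap above fits in a few lines of Python (hypothetical sketch, names mine, not from the post):

```python
from collections import deque

def search(start, neighbors, frontier_pops_newest=True):
    """Generic graph search: popping the newest entry (a stack) gives
    depth-first order; popping the oldest (a queue) gives breadth-first.
    Only the pop side of the frontier differs."""
    frontier = deque([start])
    seen = {start}
    order = []
    while frontier:
        node = frontier.pop() if frontier_pops_newest else frontier.popleft()
        order.append(node)
        for n in neighbors(node):
            if n not in seen:
                seen.add(n)
                frontier.append(n)
    return order
```

Swapping in a priority queue for `frontier` would give the in-between behaviors mentioned above, controlled entirely by the priority function.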

                --Joe

      • I find that I end up writing much cleaner code when I prototype in functional pseudocode.

        Could you give a simple example of what you mean?

        I suspect he's talking about something like this (loosely paraphrased from an intro to Haskell, IIRC):

        Problem: write a sort using the quicksort algorithm.

        Imperative pseudocode:

        1. The quicksort procedure takes an array, and an upper and lower bounds.
        2. If the lower bound isn't less than the upper bound, it does nothing.
        3. Pick a pivot value from an arbitrary point in the array between the upper bound and the lower bound.
        4. Start two pointers walking through the array, one from each end.
        5. Move the bottom pointer up till you find an item greater than or equal to the pivot value.
        6. Move the upper pointer down until you find a value less than the pivot value or until it meets the lower pointer.
        7. If the pointers haven't met, swap the elements they point to and repeat from step 5.
        8. quicksort the portion of the array between the lower bound and the item before the point where the pointers met.
        9. quicksort the portion of the array between the item after where the pointers met and the upper bound.
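A minimal Python sketch of the imperative steps above (hypothetical, not from the post; this is the classic Hoare-style in-place partition):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort following the numbered steps above."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:                 # step 2: bounds crossed, nothing to do
        return a
    pivot = a[(lo + hi) // 2]    # step 3: pivot from an arbitrary point
    i, j = lo, hi                # step 4: two pointers, one from each end
    while i <= j:
        while a[i] < pivot:      # step 5: walk the bottom pointer up
            i += 1
        while a[j] > pivot:      # step 6: walk the upper pointer down
            j -= 1
        if i <= j:               # step 7: swap and continue
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    quicksort(a, lo, j)          # step 8: sort the lower portion
    quicksort(a, i, hi)          # step 9: sort the upper portion
    return a
```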
        Functional pseudocode:
        • Sorting an empty list gives you an empty list.
        • Sorting a list (call it L) with at least one item in it (call it x) gives you a list which is the concatenation of a preamble, N occurrences of x, and a tail, where:
        • the preamble is a sorted list of everything in L less than x
        • N is the number of times x occurs in L.
        • the tail is a sorted list of everything in L greater than x
        -- MarkusQ
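The functional bullet points transcribe almost line for line into code; a hypothetical Python sketch (not from the post):

```python
def qsort(lst):
    """Functional quicksort, transcribing the bullet points directly."""
    if not lst:                          # sorting an empty list gives an empty list
        return []
    x = lst[0]
    preamble = qsort([y for y in lst if y < x])   # sorted: everything less than x
    middle = [y for y in lst if y == x]           # N occurrences of x
    tail = qsort([y for y in lst if y > x])       # sorted: everything greater than x
    return preamble + middle + tail
```

Each call allocates fresh lists rather than sorting in place, which is exactly the cost the reply below raises.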
        • the preamble is

          Unless you construct the preamble and tail in-place, which few functional languages make easy, you have to repeatedly create a new "preamble" object and a new "tail" object. Then, to get rid of unreachable objects, you need to run GC in the background, which AFAIK doesn't work reliably in a real-time setting on an embedded system such as a game console with a 16.78 MHz processor and 384 KB of RAM [gbadev.org]. (But then, I may not know far enough.)


          • Unless you construct the preamble and tail in-place

            Agreed. The point here was pseudo-code. The original poster said that they found they got better results if they wrote functional pseudo-code. This makes sense: if you specify the results you want first, and then work out the most efficient way to get them, you are more likely to succeed than if you start by writing something that is efficient and just hope that it does what you want.

            BTW, in most cases you don't need to use garbage collection to implement these sorts of functions (although it may be one of the most obvious implementations). Here, for example, you could do everything on a stack, making cleanup easy. On some systems this may even be faster than doing it in place (think dirty cache/locality of reference issues). Or you could flop back and forth between two buffers.

            The point being that these are all how-to-do-it questions which should wait until after what-to-do is known.

            -- MarkusQ

          • If you want a functional language and performance, go with Forth. You can prototype the whole thing in high-level Forth and then drop down to a lower level that is essentially executing raw on the machine for the often-repeated routines.
      • Some examples may exist- but remember that there are counterexamples too. Some problems are not inherently recursive or functional, and trying to adapt functional programming styles just brings pain.

        Humans are (mostly) imperative creatures. Computers are completely imperative machines, although they can emulate functional behavior. Sometimes it's safer to stay closer to the natural abilities of your hardware, rather than going too far beyond them.

        Try sketching out the overall structure of something like Apache or SuperMario64 as a functional program to see what I mean.

        The flood-fill algorithm in bitmapped graphics is easy to describe functionally, but running the program might swamp your computer (depending on the smartness of your compiler, and other factors).
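A hypothetical Python sketch of that trade-off (example mine, not from the post): the naive recursive flood fill is elegant to state, but its recursion depth grows with the size of the region being filled, which is the "swamping" risk:

```python
def flood_fill(grid, r, c, new):
    """Naive recursive flood fill on a 2-D grid of values.
    Elegant, but one call frame per filled cell."""
    old = grid[r][c]
    if old == new:
        return grid

    def fill(r, c):
        if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == old:
            grid[r][c] = new
            fill(r + 1, c)
            fill(r - 1, c)
            fill(r, c + 1)
            fill(r, c - 1)

    fill(r, c)
    return grid
```

A production version would use an explicit worklist instead of recursion, for exactly the stack-depth reasons discussed earlier in the thread.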

        Even a compiler (seemingly a class of software most closely resembling a unary mathematical function) is difficult to write purely functionally. I recall when a respected college professor attempted this. The result was essentially correct, but mind-bendingly ugly, and he'd spent much more time than needed.

        Those things that are drastically easier to describe in a functional style (such as the quicksort described by another respondent) tend to be fairly brief algorithms, and heavy documentation on them is easily accessible.
          • Actually, writing servers in functional languages is not a big deal, especially when the language already has libraries written, such as Erlang. In fact, servers are something Erlang was designed for.

          Nintendo uses Lisp for game programming, maybe even Mario64.
          • "already has libraries written" = "someone else had to write it". That tells nothing about how difficult it was.

            I could similarly say that "Quicksort can be implemented as 2 lines of C code (because a library's [enteract.com] already written)".

            Besides, Erlang isn't really a purely functional language. Well, "pure functional" languages hardly ever amount to more than academic toys, but there are other languages that come a lot closer.

            Here's a one line Erlang program:
            hello() -> io:fwrite("hello\n").

            See that "fwrite"? It means "write to the file called io". Telling the computer to do something- that's imperative.

            Here's the equivalent in ML [bell-labs.com]:
            val it = "hello" : string
            The critical distinction here is that there's no verb-equivalent. In a functional style, everything is a statement of fact - standalone like a noun (a constant), or relative to another like an adjective (a function). ML isn't purely functional either, but it looks closer.

            Of course, any useful language can be used for programming in either imperative or functional styles (with some small additional awkwardness). Take the QuickSort [slashdot.org] example someone gave. Even though they're called "imperative" languages, developers working in C++, Java, or Perl would tend to implement something that directly follows the functional outline.

            (Note that some academics will say that the critical distinction of a functional language is that functions are treated as first-class objects: "functional vs. non-functional". I'm using the simpler definition - functions as per math functions - which treats the distinction as "functional vs. imperative".)

            Nintendo uses Lisp for game programming, maybe even Mario64.

            Of course I haven't seen the code, but if they're at all like most game developers, Mario64 isn't written in Lisp- it is C and assembler. "Behaviors of monsters inside the game" may be in Lisp, that's a fairly common approach. But not "the game" itself. Things like "load the texture resources from disk, calculate normals for polygonal models, run a 40hz event loop on gamepad input"... they're just too hard to describe functionally.
        • Not sure what you mean:

          Apache high level:

          http_response = resolution(http_request)

          http_request = struct:
          {page, post information, get information, session data}

          something like that?

          SuperMario
          next_obstacle = pop(current_obstacle)

  • by lyoz ( 554482 ) on Monday February 03, 2003 @12:45PM (#5215868) Journal
    At the risk of being a little offtopic, I would like to share the following observations.
    Interesting how our brain works. Powerpoint Syndrome [weblogs.com] is a very fine observation. English is not my first language (and that explains all the grammatical and spelling mistakes :-) ). However, it has been the language of my education. One of the tips'n'tricks that my English teacher at school told me was that in order to improve my English, I should start thinking in it. The idea was that our brain uses the first language (Urdu in my case), and so our thoughts are limited by the expressions we can come up with using that language. Whatever language we learn afterwards is a process of run-time translation. But then, with the passage of time, we master other languages and can train our brain to think in all of them.

    So what's the point in pointing out the obvious? We are taught the basics, like the alphabet, and then we build upon those basics. However, our knowledge, our way of thinking, will always be limited by those basics. It's just like the decimal system: there are only 10 digits, and you'll only be re-using them again and again. Consider how difficult it would have been if we all had to use the binary system for our daily mathematics. The same goes for the possibility of using hexadecimal or maybe even base-100 number systems.

    Edward de Bono pointed out the very same thing in his book on lateral thinking. We are taught a basic way of thinking, of reasoning. That limits us in looking at things from a different perspective.

    An interesting paper, Beyond Language: Cultural Predispositions in Business Correspondence [nmt.edu], discusses this issue.

    • > The idea was that our brain uses the first language (Urdu in my case), and so our thoughts are limited by the expressions we can come up with using the language.

      Funny, most of my highest-order thoughts are done without any language at all. The language-based running stream of thought is merely the chat room of the intellect, not the substance.

    • What you're referring to is the Sapir-Whorf hypothesis, the statement that human thought is limited by the language of the thinker. I think it's largely discredited these days, though I could be mistaken. Some interesting stuff, though.
      • by Cuthalion ( 65550 ) on Monday February 03, 2003 @03:12PM (#5216798) Homepage
        The "strong form" of the Sapir-Whorf hypothesis, that language restricts thought, is not taken very seriously by linguists anymore. The "weak form", that language influences thought is pretty widely accepted.
        • For me it's, Language restricts expression.

          My English, is kinda, well crap and my grammar is poor at the best of times.

          My thoughts tend to be more empathetic, for me it's the answer isn't important, it's how you arrived at that answer. Words do not just have 'meaning' they have emotion and history, If I think about it, it can take me as long to work out if a word is correct, as it did to write the while document. (so I kinda go for free flow).

          I have thoughts and ideas that are held in abstraction and are more or less impossible to explain because of the complexity surrounding them.

          I'm glad my English is so shit, I beleive my thoughts are richer for it.
    • It seems "obvious" that language controls thought and it is also wrong. The Sapir-Whorf hypothesis has been shown to be incorrect, but it is also quite easy to test yourself as well.

      When it comes to foreign languages the best way to learn is to train it a lot. Preferably go to a country where it is spoken and live there for a few months/years and you will find yourself thinking in that language i no-time. (Or after a few months at least.)

      Furthermore, if you speak a language which has only one word for snow does that mean that you can't differ between different types of snow? (Wet, dry, with hard layer on top etc.) Of course not! You only need experience.

      If you train in computer science/engineering (which I do) you will quickly develop your ability to count in number systems other than decimal. (Binary also has the bonus that addition and subtraction are the same thing. ;-) If you had been taught to count binary/hex whatnot from the beginning you wouldn't find it harder than learning to count in dec. It's all aquired skills.
      • It's lame to reply to oneself but I found a reference discrediting Sapir-Whorf: Steven Pinker [mit.edu].
      • In binary, addition and subtraction are not the friggin same??!! 1-0 = 0, 0-1 = -1. Spot the difference!
        • First off your example is not correct. It should be

          1-0=1; 0-1=-1=1

          Because in a binary system -1 is not a valid number. This is only correct for bitwise operations though.

          And my point was that in GF(2^n), i.e. Boolean algebra:

          a+b=a-b

          This is true as long as you have unsigned values. (Which you often have when doing binary calculations.)
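The identity above can be checked directly; a tiny hypothetical Python sketch (names mine, not from the post):

```python
# In GF(2), addition and subtraction are the same operation: bitwise XOR.
# Each bit position is arithmetic mod 2, where +1 and -1 coincide.
def gf2_add(a, b):
    return a ^ b
```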
          • Because in a binary system -1 is not a valid number.

            Why not? -1 is just as valid in binary as in octal, decimal, or hex. "binary" just means base-two.

            Maybe you think it implies the values are unsigned, which is the simplest obvious way to handle numbers in a computer. Some programmers observe how their computers behave, and assume this is the truth. This can lead them to pronounce absurdities like 65536+2=1. But computers are limited - the definitions of math provide the real authoritative results (unless you state a specific set of alternative assumptions).
              If you'd bothered to read the rest of my post you would have seen that I already addressed that issue. In the sentence following the one you quoted, I wrote that "this is only correct for bitwise operations". Now, if you have an idea for how to represent -1, 0, and 1 in one bit of information, then pray tell me; I'm sure a lot of other academics would be interested as well.

              And I also pointed out that it assumed the value was unsigned. And for a lot of applications /this is a valid assumption/. And not only for direct computer purposes but also for most parts of information theory, such as crypto, compression, and coding. When you work with these problems you virtually /always/ do it in Galois fields, and then there is no need for negative numbers, because each negative number is equal to a positive number.

              And 65536+2=1 is not at all an absurdity; it's just done modulo 65537. (You probably intended to write 65535+2=1, since 2^16 is 65536.)

              If you haven't studied abstract algebra you probably feel that counting modulo something is "incorrect", but the fact is that it is a more fundamental branch of mathematics than what you do in Z or R. The idea that there is an infinite number of numbers is more complex than having a limited amount. And in many applications it is more useful to work with only a subset of numbers.
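Both readings of the "absurdity" in this exchange can be sketched in a couple of lines of Python (hypothetical helper, names mine):

```python
# 65536 + 2 = 1 is ordinary arithmetic modulo 65537, while a 16-bit
# unsigned register wraps modulo 65536 (so there, 65535 + 2 = 1).
MOD_PRIME = 65537      # the modulus the parent post describes
MOD_U16 = 1 << 16      # what 16-bit unsigned hardware actually does

def add_mod(a, b, m):
    return (a + b) % m
```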
          • 1-0 = 0. Oh my God! I will definitely have to go to grade school again ;(

            Anyways, the whole point is that binary unsigned vs. binary signed has to do with representation in memory. Computers usually use two's complement with a modulus of 128, 32768, or 2^31. But -1 is a valid binary number.
    • by Anonymous Coward
      If we were taught math in binary, we could count to 1111111111 using our fingers (or 1023 in base-10)
    • I don't think it is necessarily true that the brain thinks only what the tounge can say.

      Sometimes people hold unreasoned opinions. That is, they think something for reasons they can't, or at least haven't yet, articulated. For instance, I might hold that 789/5 = 2.92, but I'd have done my math wrong. I might also use different definitions of words to think than you do. Unless I define each and every word to myself in my head, and then define all the words used in the definitions of those words, and in the definitions of the words used in the definitions of the definitions, ad nauseam, then I am thinking something I cannot articulate.

      Sometimes we do not articulate thoughts to ourselves because it is emotionally unsettling or we are too lazy.

  • some suggestions (Score:3, Informative)

    by jovlinger ( 55075 ) on Monday February 03, 2003 @12:46PM (#5215875) Homepage
    I would suggest you look into aspect oriented programming, or perhaps multi-dimensional separation of concerns.

    A good place to start is aosd.net
  • Obvious answer? (Score:5, Interesting)

    by Dr. Photo ( 640363 ) on Monday February 03, 2003 @01:48PM (#5216205) Journal
    Ruby [ruby-lang.org] is a programming language that was born and raised in its native Japan, which means it may very well be, by definition, what you say you're looking for.

    Incidentally, Ruby, though purely-OO, supports nifty things like true closures, and you can end up doing functional programming without realizing it at first [Ruby, of course, is designed with this sort of thing in mind]. It was the realization that I was doing this (or something very close to it), in conjunction with Paul Graham's essays [paulgraham.com] that got me interested in Scheme [schemers.org] (a sleek, lightweight dialect of Lisp).
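    As an illustrative sketch of the closure point (shown in Python rather than Ruby, only to keep the examples on this page in one language), a function can return behaviour that captures its own state, and you slip into functional style almost without noticing:

```python
# Closures: make_counter returns a function that owns its own count.
def make_counter():
    count = 0
    def step():
        nonlocal count  # mutate the captured variable, not a global
        count += 1
        return count
    return step

c = make_counter()
print(c(), c(), c())  # 1 2 3

# Passing behaviour as a value, map-style -- the same habit Ruby's
# blocks encourage:
squares = list(map(lambda x: x * x, range(5)))
print(squares)  # [0, 1, 4, 9, 16]
```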

    So, perhaps the only real answer is to learn as many interesting programming languages as you can, and use the broadened perspective you gain to make an informed decision for yourself.

  • by Anonymous Coward
    I know it's not eastern but perl may as well be Dutch, and Java has got to be (verbose) Latin.

    :P *runs away*

  • by kapital ( 471240 ) on Monday February 03, 2003 @02:06PM (#5216322)
    Kishotenketsu is a result of the Japanese aversion to direct confrontation, and consciousness of status.

    exactly! what you'd end up with is an api that has 100 different functions, each of which can only hint at the general direction of the output without making any kind of assertion about what is actually the correct output given your parameters. and care would be taken so that no one function stands out as an obvious alternative to the others, except for maybe the more "senior" functions that have been built into the library for some time. but, that's ok, because those functions, while the most obvious to use, would be the least useful in terms of the quality and clarity of the values returned.

    this gets no play in my ride.
    • And this is different from Microsoft's APIs how?
      • You think he's kidding?
        BOOL GetTextExtentPoint(HDC hdc, LPCTSTR lpString, int cbString, LPSIZE lpSize);
        "The GetTextExtentPoint function computes the width and height of the specified string of text. "
        BOOL GetTextExtentPoint32(HDC hdc, LPCTSTR lpString, int cbString, LPSIZE lpSize);
        "The GetTextExtentPoint32 function computes the width and height of the specified string of text."
        BOOL GetTextExtentPointI(HDC hdc, LPWORD pgiIn, int cgi, LPSIZE lpSize);
        "The GetTextExtentPointI function computes the width and height of the specified array of glyph indices."
        BOOL GetTextExtentExPoint(HDC hdc, LPCTSTR lpszStr, int cchString, int nMaxExtent, LPINT lpnFit, LPINT alpDx, LPSIZE lpSize);
        "The GetTextExtentExPoint function retrieves the number of characters in a specified string that will fit within a specified space and fills an array with the text extent for each of those characters."
        BOOL GetTextExtentExPointI(HDC hdc, LPWORD pgiIn, int cgi, int nMaxExtent, LPINT lpnFit, LPINT alpDx, LPSIZE lpSize);
        "The GetTextExtentExPointI function retrieves the number of characters in a specified string that will fit within a specified space and fills an array with the text extent for each of those characters."
  • Western Example (Score:3, Insightful)

    by zulux ( 112259 ) on Monday February 03, 2003 @02:25PM (#5216435) Homepage Journal
    Ken Kesey's Sometimes a Great Notion is written in this style. Each section has its own viewpoint of the central story - sometimes it's dialog, sometimes it's surreal. Sometimes the same event is played out with different outcomes depending on whose viewpoint it is.

    It's a good read as well, and an "English Teacher's" favorite so beware - there's a ton of symbolism if you choose to read into it a bit too much.

  • > (C More or Less) [ic.ac.uk] A subject-oriented language (SOL). Each C+- class instance, known as a subject, holds hidden members, known as prejudices, agendas or undeclared preferences, which are impervious to outside messages; as well as public members, known as boasts or claims.

    The following C operators are overridden as shown:

    > better than
    >> way better than
    < forget it
    !> not on your life
    == comparable, other things being equal
    !== get a life, guy!

    C+- is strongly typed, based on stereotyping and self-righteous logic. The Boolean variables TRUE and FALSE (known as constants in other, less realistic languages) are supplemented with CREDIBLE and DUBIOUS, which are fuzzier than Zadeh's traditional fuzzy categories. All Booleans can be declared with the modifiers strong and weak. Weak implication is said to "preserve deniability" and was added at the request of the DoD to ensure compatibility with future versions of Ada. Well-formed falsehoods (WFFs) are assignment-compatible with all Booleans. What-if and why-not interactions are aided by the special conditional EVENIFNOT X THEN Y.
  • If you take it to its extreme, Object Oriented Programming can be seen to be declarative. You state that "there is an object which has the following behaviours". From that object you derive more sophisticated objects and describe their behaviours. If you get the objects right, the bits of imperative code in the implementation are "trivially" simple, so that when you get to the end, the "main program" simply instantiates a single object. The program's behaviour does not come from an explicit ordering of statements but arises as a consequence of instantiating the main object, which instantiates other objects it needs, which... and so on.
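A minimal sketch of that "main program instantiates a single object" style (all class and method names here are invented for illustration, not from any real codebase):

```python
# Behaviour emerges from construction: App pulls in its own parts,
# and "main" is one instantiation plus one call.
class Greeter:
    def __init__(self, name):
        self.name = name
    def greet(self):
        return f"hello, {self.name}"

class App:
    """Instantiating App builds everything it needs."""
    def __init__(self):
        self.greeter = Greeter("world")  # App instantiates its collaborators
    def run(self):
        return self.greeter.greet()

print(App().run())  # hello, world
```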

Understanding is always the understanding of a smaller problem in relation to a bigger problem. -- P.D. Ouspensky
