    How do you rate your knowledge of computers?

    I'm way above average but still feel totally inadequate. In college I wrote programs for Atmel microcontrollers and Altera FPGAs. Throughout school and work I've written quite a bit of code, mostly for embedded systems applications, and I've also translated C code to assembly and down to machine code just as an exercise. I've used all of the major OSs (Windows, DOS, Linux, Apple), but I still feel inadequate and don't have an intuitive grasp of tough questions like the following:


    -If a design change causes the active state of a microcontroller to draw 2 mA less current, is that a revolutionary improvement or just a small "nice work"?

    -Why would a program not work on a DOS emulator running on a Windows 10 PC but work fine on an actual 1990s PC running DOS?

    -Can the way you partition a hard drive to dual boot Windows and Linux affect how each OS is able to access different parts of the hard drive?

    -Being able to look at the electronic hardware schematic for any router/modem and understand everything that is going on

    -The processing of fiber optic signals in terms of # of bits, timing, voltage and current levels


    I feel like if you can't just speak to these questions fluently and answer the hell out of them, then you don't know **** about computers.

    And if you don't even understand the questions then you're the reason our society is doomed.

    #2
    Displays are another area. What kind of hardware and software is necessary to support each kind of display? LCD screens, VGA signals vs. HDMI signals. What does the data look like? What voltage and current levels are necessary for VGA? What kind of data rates are necessary for VGA vs. HDMI? Unless you have an intuitive feel for this, you don't know computers.



      #3
      Originally posted by Eddy Current:
      I'm way above average but still feel totally inadequate.

      -Can the way you partition a hard drive to dual boot Windows and Linux affect how each OS is able to access different parts of the hard drive?
      Unless you're jiving, you're not above average.



        #4
        Originally posted by Redsox:
        Unless you're jiving, you're not above average.

        Lol, you can't single out one issue and claim the entire field of computing revolves around it. I can play an Atmel microcontroller like a flute.



          #5
          Being able to take any computer issue and know what's going on at the bit or hardware level is also necessary to be a master of computing.



            #6
            Someday I will be a master. I'm only 32, and it takes years and years to master something so complex. I took a Java class in college and passed with flying colors, but I still have a long way to go with that type of programming. There are so many areas of computing and so many levels. I am almost a master of the hardware level, and that's a good start. I scored an A in all of my electronics and integrated circuit courses.



              #7
              Originally posted by Eddy Current:
              I'm way above average but still feel totally inadequate. In college I wrote programs for Atmel microcontrollers and Altera FPGAs.

              ....

              I feel like if you can't just speak to these questions fluently and answer the hell out of them, then you don't know **** about computers.

              And if you don't even understand the questions then you're the reason our society is doomed.
              Throughout school and work I've written quite a bit of code, mostly for embedded systems applications, and also translated C code to
              assembly and down to machine code just as an exercise
              Me too, mostly 32-bit ASM (AT&T syntax; I can sort of read Intel syntax, but not as well as AT&T), for Linux. Machine code, you mean actual byte code? Yes, done that too. I've used GDB and objdump for that. When debugging an ELF file in Linux with GDB, the byte code / opcodes are shown alongside the instructions. I believe there are automated Python scripts now that extract the opcodes for you. I have been meaning to learn this process of extracting byte code from debuggers on Windows OSs with PE files; never really got round to it, but I'll take a look eventually.

              EDIT:

              Btw, it just occurred to me that translating C code to assembly is basic 101 stuff. Just throw it into a debugger! Open it in a hex editor. Or use GCC to convert the C code into ASM directly. Easy as ****

              Translating it into assembly code is nothing to brag about.


              ----hello.c----

              #include <stdio.h>
              #include <stdlib.h>

              int main(void)
              {
                  printf("Hello there homeslize\n");
                  return EXIT_SUCCESS;
              }


              Use GCC with the -S flag and you have assembly instructions:

              gcc -S hello.c

              and it will give you assembly output in the form of an .s file.

              Hardly rocket science, Eddy Current, and hardly something to brag about!!

              It's like saying "Whoooo, I know how to compile a file!"

              EDIT2:

              here you go son

              hello.s
              .file "hello.c"
              .text
              .section .rodata
              .LC0:
              .string "Hello there homeslize\n"
              .text
              .globl main
              .type main, @function
              main:
              .LFB5:
              .cfi_startproc
              pushq %rbp
              .cfi_def_cfa_offset 16
              .cfi_offset 6, -16
              movq %rsp, %rbp
              .cfi_def_cfa_register 6
              leaq .LC0(%rip), %rdi
              movl $0, %eax
              call printf@PLT
              movl $0, %eax
              popq %rbp
              .cfi_def_cfa 7, 8
              ret
              .cfi_endproc
              .LFE5:
              .size main, .-main
              .ident "GCC: (Ubuntu 7.4.0-1ubuntu1~18.04.1) 7.4.0"
              .section .note.GNU-stack,"",@progbits

              Why am I even doing this? This is really basic. Nothing to brag about, Eddy Current. This is extremely basic.

              I've used all of the major OSs (Windows, DOS, Linux, Apple), but still feel inadequate and don't have an intuitive grasp of tough questions like the following:
              I've used Windows and Unix-based systems. I haven't for the life of me touched anything Apple. Absolutely hate it. I have looked at some mobile application stuff by Apple, some basic iOS stuff. Nothing fancy or in depth. Basic at best.

              -If an improvement causes the active state of a microcontroller to draw 2mA less current through some improvement, is that a revolutionary amount or just a small "nice work"
              More of an electronics question.

              -Why would a program not work on a DOS emulator running on a Windows 10 PC but work fine on an actual 1990s PC running DOS?
              DOS (MS-DOS) was a 16-bit, real-mode OS.

              Regardless, however, the answer is one of architecture. 16-bit applications won't run natively on a 64-bit version of Windows 10: when the compiler on a 16-bit OS writes its object code and the linker links the program, the assembly instructions and byte code are suited to that 16-bit environment. I'm sure if you Google around you may find an emulator or a virtual machine someone's built that will allow it.
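              A quick way to see what a given binary actually targets is the standard Unix file(1) utility, which reads the executable header (the /bin/ls path is just an example; on an old DOS .EXE it reports something different):

```shell
# file(1) reports the target format from the header:
# an old DOS program shows something like "MS-DOS executable",
# while a modern Linux binary shows an ELF header.
file -L /bin/ls
```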

              -Can the way you partition a hard drive to dual boot Windows and Linux affect how each OS is able to access different parts of the hard drive?
              You need to be more specific, since if you have a Windows partition the file system will most likely be NTFS, and the Linux one will be ext4. What are you trying to achieve? Are you referring to full-disk/volume/partition encryption coming into play here?

              -Being able to look at the electronic hardware schematic for any router/modem and be able to understand everything that is going on

              -The processing of fiber optic signals in terms of # of bits, timing, voltage and current levels
              More electronics questions, and I'm not an electronics guy and never have been.


              I just looked at Atmel microcontrollers and found examples of C code. If I read the manual, I could probably whip up some C code for that; it's just a case of reading the instruction set.


              Ask some electronics guy about the electronics questions I missed.
              Last edited by i_am_a_champ; 11-03-2019, 08:52 PM.



                #8
                Even technology like a microwave oven is taken for granted.

                There is a large transformer in a microwave oven that steps up the 120 VAC to the thousand volt range necessary to create microwaves. Then in addition to that there is a microcontroller that handles all of the user inputs from the button panel.

                Sure, they only cost $100 or so, but the average person would have zero chance of making their own even if they were given $5,000 for research and development.



                  #9
                  BoxingFan85 is our resident Java messiah. Eddy Current says he got an A in a Java class at college.

                  BoxingFan85 knows how much I do not love teh Java.

                  I'm sure, Eddy Current, you can discuss Java in great detail with the ol' chap here, since you got an A.



                    #10
                    If it wasn't for p.orn, I probably would have never learned how to even turn on a computer.

